It feels as if this is still related to rejection sampling, even though it isn't quite that. If the algorithm samples some bits and still hasn't made a decision, then it's as if all the previous samples have been "rejected". The difference seems to be that this algorithm has a memory of previously rejected samples, which makes it less likely to reject subsequent samples. "Adaptive rejection sampling" appears to be relevant: https://en.wikipedia.org/wiki/Rejection_sampling#Adaptive_re...
Yes, although the algorithm here just accepts a certain small amount of bias rather than rejecting. But the "corrected" version of this says that if I am rolling a 1dN, the first random byte will have N-1 different "rejection" criteria, while the next byte and every byte after will have only 1 "rejection" criterion.
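For contrast, the standard memoryless scheme this is being compared against can be sketched like so: every byte faces the same single rejection region, with no bias and no memory (`roll_dN` is a hypothetical helper, not anything from the article):

```python
import secrets

def roll_dN(n):
    """Roll a fair n-sided die (1..n) by rejection sampling on random bytes.

    A byte is accepted only if it falls below the largest multiple of n
    that fits in 256; otherwise it is discarded and a fresh byte is drawn.
    Unlike the biased/adaptive scheme discussed above, this is memoryless:
    each byte is tested against the identical rejection criterion.
    """
    limit = 256 - (256 % n)  # largest multiple of n that is <= 256
    while True:
        b = secrets.randbits(8)
        if b < limit:
            return b % n + 1
```

Accepted bytes are uniform over `0..limit-1`, a whole number of copies of `0..n-1`, so the result is exactly uniform at the cost of an unbounded (though geometrically unlikely) number of retries.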
Something that would be more interesting: if you had a random number primitive which could generate 1 with probability 1/2, 2 with probability 1/6, 3 with probability 1/12, and so on (p = 1/(N * (N+1))), that primitive would allow you to construct random irrational numbers sampled from [0, 1) as continued fractions. The rationals ℚ, though dense in ℝ, have measure zero, so you can ignore them.
Such a primitive would allow you to do a sort of rejection sampling with only a finite number of rejections before you are guaranteed an answer, since the continued fraction of a rational number has a finite expansion.
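One way to realize such a primitive, as a sketch: since P(N = k) = 1/k - 1/(k+1), the tail probability telescopes to P(N >= k) = 1/k, so N = floor(1/U) for U uniform on (0, 1) has exactly this distribution (`sample_cf_digit` is a hypothetical name, not from the thread):

```python
import random

def sample_cf_digit(rng=random):
    """Sample N in {1, 2, ...} with P(N) = 1/(N*(N+1)).

    P(N = k) = 1/k - 1/(k+1) telescopes to P(N >= k) = 1/k, which is
    matched by N = floor(1/U) for U uniform on (0, 1). This is also the
    distribution of the first continued-fraction coefficient of a
    uniform draw from (0, 1).
    """
    u = rng.random()
    while u == 0.0:  # skip the measure-zero endpoint so 1/u is defined
        u = rng.random()
    return int(1 // u)  # floor(1/u)
```

Note this matches the distribution of the *first* coefficient of a uniform draw; whether feeding independent draws into every coefficient position yields an exactly uniform real is a separate question (later coefficients of a uniform real follow the Gauss-Kuzmin distribution).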