This article isn't very convincing to me. I mean, I one hundred percent buy that eventually consistent stream processing systems can theoretically be subject to unbounded error. But eventual consistency isn't just a theoretical model. It's also a practical engineering decision, and so in order to evaluate its use for any given business purpose we have to see how it performs in practice. That is, what is the average/99.9%/max error? And we have to understand how business-critical the correct answer is. This article has some great examples of theoretical issues with eventually consistent stream processing computation, but it doesn't demonstrate that any real system exhibits these problems under a realistic workload.
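For what it's worth, the "measure it in practice" point is easy to prototype. Here's a toy Python sketch (mine, not from the article, and not modeling any particular system) that delivers the same updates to a lagging replica in shuffled order and records the mean/99.9th-percentile/max error a reader would see before convergence:

```python
# Toy model: an eventually consistent replica applies the same updates
# as the source, but in a shuffled order, standing in for replication
# lag / reordering. We compare what a reader sees against what a
# strongly consistent system would report at each step.
import random

random.seed(0)
true_updates = [random.choice([-1, 1]) for _ in range(10_000)]
delivered = true_updates[:]   # same multiset of updates...
random.shuffle(delivered)     # ...delivered out of order

correct = 0   # running answer under strong consistency
observed = 0  # running answer a reader of the lagging replica sees
errors = []
for true_delta, seen_delta in zip(true_updates, delivered):
    correct += true_delta
    observed += seen_delta
    errors.append(abs(observed - correct))

errors.sort()
print("mean error :", sum(errors) / len(errors))
print("p99.9 error:", errors[int(len(errors) * 0.999)])
print("max error  :", max(errors))
print("converged  :", observed == correct)  # True: eventual consistency holds
```

Obviously a real deployment's error profile depends entirely on the workload and delivery semantics, which is exactly why the article would be stronger with measurements rather than constructed worst cases.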
> Not all is lost! There are stream processing systems that provide strong consistency guarantees. Materialize and Differential Dataflow both avoid these classes of errors by providing always correct answers
Yeah, I was expecting to see what tradeoffs Materialize made to get the 'always correct' result. There is definitely something 'lost' in exchange for 'always correct' too.
I can only attribute this one-sided take to deviousness. Personally, I would avoid whatever this company is selling.
Do you, like, work for Materialize? You're awfully connected to many of the folks there, and this is a pretty plant-like statement to make in the comments section for a piece by the company.
"Please don't post insinuations about astroturfing, shilling, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data."