Is an LR parser better than a parser combinator?
Seems like there are pluses to rolling your own parser combinator over spending the time to integrate with yacc or some other library.
I am a bit biased (in the sense that I wrote my own SLR parser generator twice, in C# and F#!), but the benefit of LR is that you can know whether your CFG contains conflicts, which is a game changer for preserving backwards compatibility when you change the syntax later during the life of the DSL.
That said, a parser combinator library could support converting the resulting combined parser into an EBNF grammar and checking whether that grammar contains conflicts.
Hmm... a grammar conflict in parser combinators always requires that you step down from the parser abstraction and resolve it by hand with basic language features, doesn't it? And it's quite hard to notice in the first place.
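To make that concrete, here's a minimal PEG-style combinator sketch (entirely my own toy, not any real library): with ordered choice there is no "conflict" to report at all, because overlap between alternatives is silently decided by the order you happened to write them in.

```python
# Minimal PEG-style parser combinators (hypothetical sketch, not a real
# library). A parser takes (tokens, index) and returns (new_index, value)
# on success or None on failure.
def token(t):
    def parse(toks, i):
        return (i + 1, t) if i < len(toks) and toks[i] == t else None
    return parse

def seq(*parsers):
    # Run parsers one after another; fail if any of them fails.
    def parse(toks, i):
        vals = []
        for p in parsers:
            r = p(toks, i)
            if r is None:
                return None
            i, v = r
            vals.append(v)
        return (i, vals)
    return parse

def alt(*parsers):
    def parse(toks, i):
        for p in parsers:       # first alternative that matches wins;
            r = p(toks, i)      # the rest are never even tried
            if r is not None:
                return r
        return None
    return parse

pair = seq(token("id"), token(","), token("id"))

bad  = alt(token("id"), pair)   # bare id listed first: shadows the pair rule
good = alt(pair, token("id"))

toks = ["id", ",", "id"]
print(bad(toks, 0))   # stops after one token; ", id" is silently left over
print(good(toks, 0))  # consumes all three tokens
```

Both orderings "work" on some inputs, which is exactly why this kind of latent ambiguity is hard to notice until real code hits it.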
Are you concerned with the documentation getting out of sync with the actual language?
In a situation where I'm adding a new feature to the language, with new syntax that requires a new rule in the grammar, I'm concerned that the new rule will accidentally "capture" some code that already exists in the wild and was previously derived by another part of the grammar.
To give a very ugly example, if you have a language with function calls f(expr, expr, expr) and you want to add tuple syntax to expressions with a brand new rule:
expr := expr COMMA expr
Then you might have accidentally turned all functions into unary functions, as the tuple rule captures the "expr, expr, expr" part and leaves you with f(expr).
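This failure mode is exactly what an LR construction reports up front. Here's a toy SLR(1) conflict check in Python (my own sketch, not a real generator), with the example boiled down to the grammar S -> E; E -> E ',' E | 'id'; it flags the shift/reduce conflict the tuple rule introduces:

```python
# Toy SLR(1) conflict detector (hypothetical sketch, not a real tool).
# Grammar: S -> E ;  E -> E ',' E | 'id'   (the tuple rule, boiled down)
GRAMMAR = [("S", ("E",)), ("E", ("E", ",", "E")), ("E", ("id",))]
NONTERMS = {lhs for lhs, _ in GRAMMAR}

def compute_first():
    # No epsilon productions here, so FIRST only needs each rule's head symbol.
    first = {nt: set() for nt in NONTERMS}
    changed = True
    while changed:
        changed = False
        for lhs, rhs in GRAMMAR:
            add = first[rhs[0]] if rhs[0] in NONTERMS else {rhs[0]}
            if not add <= first[lhs]:
                first[lhs] |= add
                changed = True
    return first

def compute_follow(first):
    follow = {nt: set() for nt in NONTERMS}
    follow["S"].add("$")
    changed = True
    while changed:
        changed = False
        for lhs, rhs in GRAMMAR:
            for i, sym in enumerate(rhs):
                if sym not in NONTERMS:
                    continue
                if i + 1 < len(rhs):
                    nxt = rhs[i + 1]
                    add = first[nxt] if nxt in NONTERMS else {nxt}
                else:
                    add = follow[lhs]
                if not add <= follow[sym]:
                    follow[sym] |= add
                    changed = True
    return follow

def closure(items):
    # LR(0) closure: pull in productions for any nonterminal after a dot.
    # An item is (production_index, dot_position).
    items, work = set(items), list(items)
    while work:
        pi, dot = work.pop()
        rhs = GRAMMAR[pi][1]
        if dot < len(rhs) and rhs[dot] in NONTERMS:
            for j, (lhs, _) in enumerate(GRAMMAR):
                if lhs == rhs[dot] and (j, 0) not in items:
                    items.add((j, 0))
                    work.append((j, 0))
    return frozenset(items)

def find_conflicts():
    # Build the LR(0) automaton; report shift/reduce clashes the SLR way
    # (reduce A -> w on every terminal in FOLLOW(A)). Reduce/reduce checks
    # are omitted for brevity -- this grammar doesn't trigger them.
    follow = compute_follow(compute_first())
    start = closure({(0, 0)})
    states, work, conflicts = {start}, [start], []
    while work:
        st = work.pop()
        next_syms = {GRAMMAR[pi][1][dot] for pi, dot in st
                     if dot < len(GRAMMAR[pi][1])}
        shift_terms = {s for s in next_syms if s not in NONTERMS}
        for pi, dot in st:
            lhs, rhs = GRAMMAR[pi]
            if dot == len(rhs) and lhs != "S":
                for t in sorted(follow[lhs] & shift_terms):
                    conflicts.append((lhs, rhs, t))
        for sym in next_syms:
            nxt = closure({(pi, dot + 1) for pi, dot in st
                           if dot < len(GRAMMAR[pi][1])
                           and GRAMMAR[pi][1][dot] == sym})
            if nxt not in states:
                states.add(nxt)
                work.append(nxt)
    return conflicts

print(find_conflicts())  # after "E , E" the next ',' can shift or reduce
```

The reported conflict is the whole story in miniature: on seeing the second comma of f(a, b, c), the parser can either reduce "E , E" to a tuple or keep shifting, which is precisely the ambiguity a generator forces you to resolve before you ship the syntax change.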