Great question. The granularity of your synchronization impacts the user experience, and sometimes the product offering itself. It can range from "replace the whole object" all the way to "string operations are merged without conflicts to match user intention", with "centralized resource locking" somewhere along the way.
The goal product-wise is to ensure intention preservation, but that is hard to achieve in general as intention is tied to what the product does.
Take, for instance, an integer with an increment operation, say an upvote counter in an HN-like app. If your synchronization simply overwrites modified values with a newest-wins approach, the following situation loses an upvote: you have 5 upvotes, and A and B both upvote you simultaneously, each setting the count to 6. The server tells A that the count has been set to 6, then tells B the same thing, and both are happy. Product-wise, however, it should have been 7.
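A minimal sketch of that lost-upvote scenario (hypothetical names, not any particular framework's API), contrasting state-based newest-wins with syncing the operation itself:

```python
# Last-write-wins on the *value*: the client's absolute value replaces the server's.
def lww_merge(server_value, client_value):
    return client_value

# Server starts at 5 upvotes; A and B both read 5 and upvote concurrently.
server = 5
a_writes = server + 1  # A sends "set upvotes to 6"
b_writes = server + 1  # B sends "set upvotes to 6"

server = lww_merge(server, a_writes)  # 6
server = lww_merge(server, b_writes)  # still 6: B's write clobbers A's
assert server == 6  # one upvote lost; the intended total was 7

# Syncing the *operation* ("increment") instead preserves intention,
# because increments commute: applying both in any order gives 7.
server = 5
server += 1  # apply A's increment op
server += 1  # apply B's increment op
assert server == 7
```

This is why counter CRDTs sync increments rather than absolute values.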
That common newest-wins approach is both elegant and tricky. It requires very precise clocks and time synchronization, because non-commutative operations (such as list insertion) are order-dependent.
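To see why ordering matters, here is a small sketch (illustrative only) of two replicas that apply the same pair of concurrent list insertions in a different order and diverge:

```python
# Insertion into a list does not commute: applying the same two
# operations in a different order yields different documents.
def insert(doc, index, char):
    return doc[:index] + [char] + doc[index:]

ops = [(0, "A"), (0, "B")]  # two concurrent insertions at position 0

replica1 = []
for index, char in ops:            # applies A's op, then B's
    replica1 = insert(replica1, index, char)

replica2 = []
for index, char in reversed(ops):  # applies B's op, then A's
    replica2 = insert(replica2, index, char)

assert replica1 == ["B", "A"]
assert replica2 == ["A", "B"]
assert replica1 != replica2  # divergence without an agreed global order
```

Without a trusted global ordering (e.g. from tightly synchronized clocks), the replicas end up with different documents, which is exactly the problem TrueTime-style clock bounds help solve.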
That is why Google's TrueTime API (introduced in Spanner[0]) is such a big deal.
Actually, for most use cases, newest-update-wins is sufficient if it can be applied at a fine granularity (one property of an object/document).
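A sketch of what per-property newest-wins can look like, assuming each property carries a timestamp (a hypothetical structure, not any specific framework):

```python
# Merge two versions of a document property-by-property, keeping the
# value with the newest timestamp for each property independently.
def merge(local, incoming):
    merged = dict(local)
    for key, (value, ts) in incoming.items():
        if key not in merged or ts > merged[key][1]:
            merged[key] = (value, ts)
    return merged

doc = {"title": ("Draft", 100), "body": ("hello", 100)}
# One client renames the doc while another edits the body, concurrently:
from_a = {"title": ("Final", 105)}
from_b = {"body": ("hello world", 103)}

doc = merge(merge(doc, from_a), from_b)
assert doc == {"title": ("Final", 105), "body": ("hello world", 103)}
# Neither edit clobbers the other, because the unit of conflict
# resolution is a single property rather than the whole object.
```

Whole-object newest-wins would have dropped one of the two edits; shrinking the granularity to one property avoids the conflict entirely here.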
It's what web applications have been doing forever, and being a 'realtime' framework doesn't change this if your use case isn't something like Google Docs.