Modern language standards are explicit about what these operations do, and modern architectures all provide two variants of right shift (arithmetic and logical, in x86 parlance) to handle the difference. This isn't a problem in practice for modern compilers, though it remains a good warning for developers writing their own optimizations.
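To make the difference concrete, here's a minimal C sketch (note that right-shifting a negative signed value is technically implementation-defined in C, though mainstream compilers all treat it as an arithmetic shift):

    #include <stdio.h>

    int main(void) {
        int          s = -7;
        unsigned int u = 0x80000000u;

        /* Signed right shift: compiled to an arithmetic shift (x86 SAR),
         * which copies the sign bit and rounds toward negative infinity. */
        printf("%d >> 1  = %d\n", s, s >> 1);   /* prints -4 */

        /* Signed division truncates toward zero, so the two differ
         * for negative operands. */
        printf("%d / 2   = %d\n", s, s / 2);    /* prints -3 */

        /* Unsigned right shift: a logical shift (x86 SHR), zero-filled. */
        printf("%#x >> 1 = %#x\n", u, u >> 1);  /* prints 0x40000000 */

        return 0;
    }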
Really, it's a note from a world we've forgotten, where "all" computer operations were unsigned and 2's complement math was a clever trick deployed only occasionally.
It's true that 2's complement wasn't always universal, but von Neumann proposed it in the EDVAC report (1945), and it was the IBM System/360 (1964) that really championed it. The PDP-8, 9, 10, and 11 all used 2's complement. By 1976, when this MIT AI Lab report was written, 2's complement was pretty standard.
I think this is just the MIT AI Lab tilting at windmills. They had their quirks, like pushing "pels" over "pixels" well after the world had settled on the latter.
https://godbolt.org/z/2PAP5c