Beyond the fact that 100ms is very noticeable on its own, the problem with such "it isn't noticeable" arguments is that many such implementations get layered, and soon you have a pretty egregious situation.
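The layering argument is just arithmetic. A minimal sketch (the ten-layer count and per-layer 100ms budget are assumptions for illustration, not measurements):

```python
# Hypothetical stack of ten layers, each adding a delay its author
# considered "not noticeable" on its own.
layer_delays_ms = [100] * 10  # assumed per-layer budget of 100 ms

# Sequential layers add up: no single layer is at fault,
# but the user waits for the sum.
total_ms = sum(layer_delays_ms)
print(total_ms)  # a full second of waiting
```

Each author can honestly say their contribution is below the threshold of perception, yet the user-visible total is not.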
As a lazy example, I happen to have Excel open on macOS -- whenever I try to format a cell, the format dialog takes just shy of 4 seconds to appear. How is that even remotely possible? Tens of billions of instructions per second... and it leaves me waiting whenever I set a single cell to display as a number.
Memory utilization has the same problem. Each individual allocation is meaningless, but pretty soon you have an idle and unnecessary "Adobe Desktop Service" sitting in the background soaking up almost 2GB.