Excuse my ignorance on the subject, but I thought gamma died with CRTs and that RGB values in images are intensity representations on a linear scale. Am I wrong? Is sRGB just used for publishing or something?
The transition from CRTs to LCDs didn't happen at the flick of a switch; it happened gradually. The signals between the computer and the display stayed the same, based on standards that maintained backward compatibility.
I don't know if gamma was a happy accident or a result of clever engineering, but it has a benefit even today. The eye's response to light is not linear, and a gamma-corrected signal has nearly even steps of perceived brightness between each of the 256 levels. A linear signal does not, and would require more bits per channel for an acceptable display.
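For concreteness, here's a minimal Python sketch of the standard sRGB transfer curve (a 2.4-power segment with a small linear toe, roughly gamma 2.2 overall). Pixel values are stored on the encoded side and only decoded to linear light when you need to do arithmetic on them:

    def linear_to_srgb(l):
        """Encode a linear-light value in 0..1 with the sRGB transfer curve."""
        if l <= 0.0031308:
            return 12.92 * l
        return 1.055 * l ** (1 / 2.4) - 0.055

    def srgb_to_linear(v):
        """Decode an sRGB-encoded value in 0..1 back to linear light."""
        if v <= 0.04045:
            return v / 12.92
        return ((v + 0.055) / 1.055) ** 2.4

    code = 30                              # an 8-bit sRGB pixel value
    linear = srgb_to_linear(code / 255.0)  # about 0.013 in linear light
    assert round(linear_to_srgb(linear) * 255.0) == code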
sRGB is used for encoding pretty much all JPEGs, and for bitmaps in other formats too. (Although some are in wider-gamut colour spaces like AdobeRGB or DCI-P3, as many displays now support this wider range of colours.)
If you want to be efficient with your encoding, you'll always use a gamma-like function. This is because our eyes have a roughly logarithmic response to brightness, and a gamma curve approximately compensates for that.
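A rough worked example of that efficiency argument, using CIE L* as an approximate stand-in for perceived lightness: with 8-bit codes stored linearly, one code step at the dark end is a big, visible jump while a step at the bright end is imperceptibly small, whereas sRGB-encoded codes give roughly even steps across the range.

    def lightness(y):
        """CIE L* (0..100) for a linear luminance y in 0..1."""
        return 116.0 * y ** (1.0 / 3.0) - 16.0 if y > 0.008856 else 903.3 * y

    def srgb_to_linear(v):
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    for code in (1, 32, 128, 254):
        lin_step = lightness((code + 1) / 255) - lightness(code / 255)
        enc_step = (lightness(srgb_to_linear((code + 1) / 255))
                    - lightness(srgb_to_linear(code / 255)))
        print(f"code {code:3d}->{code + 1:3d}: "
              f"linear step {lin_step:.2f} L*, sRGB step {enc_step:.2f} L*")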
After reading up on it I'm so glad the implementation is transparent and I don't have to deal with it, but I'm also really dissatisfied with the discrete and imperfect nature of lookup tables and perceived brightness approximations.
LCDs still have their own inherent "gamma" curve, which is actually sigmoid in shape. They just emulate CRT gamma for backwards compatibility and hide their own low-level details in the process.
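A minimal sketch of that emulation idea, with a made-up sigmoid standing in for the panel's native drive-to-light response (the real curve and the controller internals will differ): build a lookup table mapping each input code to the drive level whose native output matches what a 2.2-gamma display would show.

    import math

    def native_response(drive):
        """Hypothetical S-shaped drive (0..1) -> light (0..1) curve of the panel."""
        raw = 1.0 / (1.0 + math.exp(-10.0 * (drive - 0.5)))
        lo = 1.0 / (1.0 + math.exp(5.0))
        hi = 1.0 / (1.0 + math.exp(-5.0))
        return (raw - lo) / (hi - lo)      # rescaled so 0 maps to 0 and 1 to 1

    def inverse_native(target, eps=1e-6):
        """Invert the monotonic native curve by bisection."""
        lo, hi = 0.0, 1.0
        while hi - lo > eps:
            mid = (lo + hi) / 2.0
            if native_response(mid) < target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    # 256-entry table: input code -> drive level that yields (code/255)^2.2 light.
    lut = [inverse_native((code / 255.0) ** 2.2) for code in range(256)]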
I built an RGB LED display and at first implemented brightness levels by just modulating the PWM duty cycle linearly. I quickly found that it was lacking in dark shades and that the bright shades were almost indistinguishable from one another. The solution was gamma correction, which made the shades look evenly distributed across the brightness range.
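A sketch of that kind of fix (the gamma value and PWM resolution here are assumptions, not necessarily what the original build used): precompute a table mapping 8-bit brightness levels to gamma-corrected PWM duty cycles, then index it when driving the LEDs.

    GAMMA = 2.2        # assumed correction exponent; ~2.2-2.8 is common for LEDs
    PWM_BITS = 12      # assumed PWM resolution; extra bits keep dark steps distinct
    PWM_MAX = (1 << PWM_BITS) - 1

    # 256-entry table: 8-bit brightness level -> gamma-corrected PWM duty cycle.
    gamma_lut = [round(((level / 255.0) ** GAMMA) * PWM_MAX) for level in range(256)]

    # "Half brightness" (level 128) ends up at only ~22% duty cycle, because half
    # the perceived brightness takes far less than half the emitted light.
    print(gamma_lut[128], "of", PWM_MAX)   # 899 of 4095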