You're typically not watching them at the same distance either, and the fact that TV screens are interlaced might also affect the perceived flicker (as I understand it, 1990s computer monitors weren't interlaced).
I remember that my 67Hz Macintosh monitor really strained my eyes as a teenager, while I never had that problem with a TV.
My first computer was a TRS-80 Color Computer 2. Neon green screen with "black-ish" text, 32 cols x 16 rows. The cursor was a rainbow flashing block. I'm still not certain that screen couldn't have triggered a seizure in someone with epilepsy...
Now imagine being 10 years old, and having that in front of your face while coding in BASIC, glowing from a 19" TV mere inches in front of you...hehe. That was me!
I won't claim one way or the other whether I suffered damage of some sort - I'm really not sure.
Strangely, my mom always complained to me about "sitting too close to the TV" - but had no problem with me being in front of this thing for hours on end, typing away, etc.
You're misunderstanding interlacing. Interlacing was a way to keep the 50Hz (or 60Hz, in some parts of the world) field rate while fitting in twice as many rows per frame. It didn't give you a higher refresh rate than that.
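Rough sketch of the arithmetic, using the nominal PAL (625-line) and NTSC (525-line) figures; the function is just illustrative:

```python
def interlaced_timings(field_rate_hz, total_lines_per_frame):
    """Return (frame_rate, lines_per_field, horizontal_scan_khz) for an interlaced signal."""
    frame_rate = field_rate_hz / 2                     # two fields make one full frame
    lines_per_field = total_lines_per_frame / 2        # each field carries half the rows
    h_scan_khz = field_rate_hz * lines_per_field / 1000
    return frame_rate, lines_per_field, h_scan_khz

# PAL: 50 fields/s, 625 lines per frame
print(interlaced_timings(50, 625))   # -> (25.0, 312.5, 15.625)

# NTSC (nominal): 60 fields/s, 525 lines per frame
print(interlaced_timings(60, 525))   # -> (30.0, 262.5, 15.75)
```

So the tube is scanned 50 (or 60) times a second, but a complete frame only comes around 25 (or 30) times a second.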
No, but the difference in scanning pattern does produce a quite stark perceptual difference between 15kHz interlaced and 31kHz progressive at 60Hz. The field-based scanning raises the chance that field 1's bottom lines will have faded by the time the monitor is drawing field 2's bottom lines. Of course, it's largely dependent on a monitor's specific phosphor persistence, which is why high-persistence monitors were so common in the earlier days of personal computing.
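A back-of-the-envelope way to see why persistence matters more for interlaced scanning (a sketch, assuming nominal 60Hz vertical rates; the helper name is made up):

```python
def line_revisit_ms(vertical_rate_hz, interlaced):
    """Milliseconds between repaints of any single scanline."""
    # Interlaced: a given line belongs to only one of the two fields,
    # so it is repainted once per frame, i.e. every two field periods.
    periods_per_repaint = 2 if interlaced else 1
    return 1000.0 * periods_per_repaint / vertical_rate_hz

print(line_revisit_ms(60, interlaced=True))    # ~33.3 ms per line on 15 kHz interlaced
print(line_revisit_ms(60, interlaced=False))   # ~16.7 ms per line on 31 kHz progressive
```

Any individual line on the interlaced display goes twice as long between repaints, so the phosphor has to hold its glow roughly twice as long to bridge the gap without visible flicker.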