...
They're mostly limited to just 60 Hz VGA input, and traditional Mac resolutions mostly refreshed above that mark, so they'd be "flicker free," which was a concern back in the day for one reason or another.
...
Not to derail the thread, but the way I remember it was this: because of the way CRTs work (scanning each line left to right, top to bottom), at a refresh rate of 60 Hz, which isn't necessarily terrible for smoothness, the subtle changes in brightness as the electron beam sweeps the screen cause a faint flicker that can lead to eye strain, headaches, etc. Lower the refresh rate and it becomes more noticeable; higher refresh rates are less noticeable.
The problem usually comes (in my experience) from working on a 60 Hz CRT for long periods, because most offices were lit by inexpensive fluorescent lighting, which also flickers in step with the 60 Hz mains (in countries where the electricity cycles at 60 Hz). That can aggravate the eyestrain and headaches some people experience, and it's mitigated somewhat by higher refresh rates. An 85 Hz refresh rate was nice (to me, anyway).
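Just to put rough numbers on it (a toy sketch, not a vision model; the exact lighting flicker rate is an assumption for illustration), here's the scale of the timings involved and why two nearly-matching flicker rates can produce a slow visible "beat":

```python
# Toy numbers only -- just to show the scale of the timings involved.

def frame_period_ms(refresh_hz):
    """Time between redraws of any given spot on a CRT, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 67, 75, 85):
    print(f"{hz} Hz refresh -> each spot waits {frame_period_ms(hz):.1f} ms between redraws")

# If the room lighting pulses at a slightly different rate than the CRT,
# the two drift in and out of phase at the difference ("beat") frequency,
# which can show up as a slow, visible pulsing.
crt_hz = 60.0
light_hz = 59.9   # hypothetical value, just to demonstrate the beat
print(f"beat frequency: {abs(crt_hz - light_hz):.1f} Hz")
```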
IIRC most Mac monitors used 67 Hz or higher.
So why are LCD panels for computers usually 60 Hz? Well, LCDs work very differently from CRTs. They don't flicker due to scan lines: a pixel holds its color and brightness value as long as it's refreshed often enough, and 60 Hz is plenty quick for that.
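A crude way to picture the difference (a made-up toy model, with an assumed phosphor decay constant and ignoring backlight PWM and LCD response time): the CRT pixel is a brief flash that decays until the next pass, while the LCD pixel just holds its level.

```python
import math

# Crude sketch: compare how much one pixel's brightness varies
# within a single 60 Hz frame on a CRT vs. an LCD.

FRAME_MS = 1000.0 / 60.0      # ~16.7 ms between refreshes
PHOSPHOR_DECAY_MS = 2.0       # made-up phosphor persistence constant

def crt_luminance(t_ms):
    # Beam hits the pixel at t = 0, then the phosphor decays
    # exponentially until the next pass ~16.7 ms later.
    return math.exp(-t_ms / PHOSPHOR_DECAY_MS)

def lcd_luminance(t_ms):
    # An LCD pixel simply holds its value between refreshes.
    return 1.0

samples = [i * FRAME_MS / 100 for i in range(100)]
for name, f in (("CRT", crt_luminance), ("LCD", lcd_luminance)):
    vals = [f(t) for t in samples]
    print(f"{name}: brightness swings from {min(vals):.3f} to {max(vals):.3f} within one frame")
```

The CRT pixel swings from full brightness to nearly dark every frame, which is the flicker; the LCD pixel stays flat, which is why 60 Hz is fine there.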
:beige: