The Glory Days of Cathode Rays

- Dinos -
- What killed CRTs? -
- Myths of inherently superior LCD image quality -
- Myth 1: Sharpness -
- Display tech and its effect on FPS standards -

Dinos

Many call objects springing from old technology "dinosaurs". However, CRT displays are more akin to dinosaurs than most other old technology: just as the dinosaurs "ruled the earth" for longer than most other reptiles, CRTs "ruled home displays" for much longer than any other display technology.

By the time personal computers were able to output RGB, CRTs were already quite sophisticated. During the '90s, advancements came especially rapidly, as media work on computers became more common.

What killed CRTs?

Yet, like the supposed meteor that abruptly drew the curtains on tyrant lizard theatre, the practically linear climb of flagship CRT monitor specs came to a sudden end.

Of course, the "meteor" that wiped out the mighty, blade-toothed CRTs was flatscreens.

When the first consumer flatscreens hit the market, essentially all of them utilized liquid crystal display (LCD) technology.

Now, for a new type of tech to obliterate consumer production of another, nearly century-old one in barely a decade, there surely has to be a good reason in some form, no?

Well, here are the inherent advantages LCDs have over CRTs:

- Far slimmer and flatter form factor
- Much lighter weight
- Lower power consumption
- Less heat output

Now, to the vast majority of people, those are pretty darn appealing assets. Their flexibility allowed for the development of far more portable devices like laptops and PDAs, for one (whether or not that's a good thing on the whole is a topic for another article).

Yet, some of you may have gone something like, "Wait, that's it? Weren't there any advantages in picture quality?"

Let's dig deeper, shall we?

Myths of inherently superior LCD image quality

Myth 1: LCDs are sharper

Readers, by this point it may be becoming apparent to you that I'm a malenky bit partial to CRTs. However, there was a practice during the era of CRT monitors that I'm rather not fond of: inflation of the advertised native resolution.

To understand what this is and why it's a problem, you first have to understand a bit about how CRTs work:

A CRT monitor contains millions of tiny red, green, and blue phosphor dots that glow when struck by an electron beam. The beam sweeps across the screen line by line, top to bottom, to build up a visible image, as illustrated below.

[Illustration: the electron beam scanning the phosphor-coated face of a CRT]
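To make the scanning order concrete, here's a minimal Python sketch of a raster scan. The dimensions and the excite_phosphors helper are illustrative stand-ins of mine, not real driver code:

    # Minimal raster-scan sketch: the beam visits every position on each
    # scanline, left to right, top to bottom, then retraces and repeats.
    SCANLINES = 480       # vertical resolution (illustrative)
    DOTS_PER_LINE = 640   # horizontal positions per line (illustrative)

    def excite_phosphors(frame, x, y):
        # Stand-in for the beam lighting the red, green, and blue
        # phosphor dots at this spot with the frame's intensity there.
        _ = frame[y][x]

    def raster_scan(frame):
        for y in range(SCANLINES):            # top to bottom
            for x in range(DOTS_PER_LINE):    # left to right
                excite_phosphors(frame, x, y)
            # horizontal retrace: beam snaps back to the left edge
        # vertical retrace: beam returns to the top-left for the next frame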



The dot pitch (the spacing between neighboring phosphor dots of the same color) is what defines the effective resolution of a CRT. The beam has to line up with the slots in the shadow mask or aperture grille for the image to stay sharp.

Let me use an example to explain. Let's say you have a 17-inch monitor with roughly 16 inches of viewable diagonal and a 0.25 mm dot pitch. The viewable area is then about 325 mm wide, and 325 mm divided by 0.25 mm gives you only about 1,300 phosphor triads across the screen. Drive that tube at an advertised 1600x1200 and there are simply more pixels in the signal than dots on the glass.
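For the curious, here's that arithmetic as a small Python sketch. The size, pitch, and aspect ratio are illustrative figures, not the specs of any particular model:

    # Rough upper bound on what a CRT can resolve, from viewable size
    # and dot pitch. All numbers here are illustrative.
    MM_PER_INCH = 25.4

    def max_resolvable(viewable_diag_in, dot_pitch_mm, aspect=(4, 3)):
        # Approximate phosphor triads across and down the tube.
        aw, ah = aspect
        diag = (aw ** 2 + ah ** 2) ** 0.5            # 5.0 for 4:3
        width_mm = viewable_diag_in * MM_PER_INCH * aw / diag
        height_mm = viewable_diag_in * MM_PER_INCH * ah / diag
        return int(width_mm / dot_pitch_mm), int(height_mm / dot_pitch_mm)

    # The 17-inch example above: ~16 inches viewable, 0.25 mm pitch
    print(max_resolvable(16.0, 0.25))  # -> (1300, 975)

By that estimate, anything much past 1300x975 on such a tube is marketing headroom rather than resolvable detail.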

Display tech and its effect on frame rate standards

60 FPS is considered by most to look "less cinematic" because it's never displayed properly. The soap-opera blurriness people describe is to be blamed on LCDs, not the increased frame rate. LCD panels and projectors are indeed capable of displaying 24 fps without blurriness, but pixel transitions just cannot happen fast enough for 60 to look sharp. Various motion-test sites and programs demonstrate this.
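As a back-of-the-envelope illustration (the response time below is an assumption of mine, not a measured figure), here's why slow pixel transitions hurt 60 fps far more than 24 fps:

    # What fraction of each frame an LCD pixel spends mid-transition,
    # for an assumed full response time. Illustrative numbers only.
    RESPONSE_MS = 12.0  # assumed grey-to-grey response time

    for fps in (24, 60):
        frame_ms = 1000.0 / fps                      # how long each frame is held
        blurred = min(RESPONSE_MS / frame_ms, 1.0)   # fraction spent transitioning
        print(f"{fps} fps: {frame_ms:.1f} ms per frame, "
              f"mid-transition for {blurred:.0%} of it")

On those numbers, a 24 fps frame sits fully settled for most of its life, while a 60 fps frame is in transition nearly three quarters of the time.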

Under construction

Feel free to contact me with feedback or questions

Back to the front page

Writing for this article began in July 2020.