Just as LCD and plasma TVs become mainstream, a new wave of display technologies has arrived to supplant them.
If, like me, you are happy to watch words take shape on a liquid crystal display (LCD), happy with the LCD on your mobile phone, but still reluctant to let an LCD rule your living room, the wait may soon be over.
Don't get me wrong. I know the latest generation of large-screen LCD and plasma TVs offers resolutions that make my cathode ray tube (CRT) TV a laughing stock. I could use the space that a flat-panel TV would create in my living room, and would like its cool looks. I could even live with the prices. Yet despite the advantages of relegating the CRT to a back room, there are two drawbacks that have kept me from taking the plunge.
The first is the lack of high-definition (HD) content to go with an HD-ready set. Even if I replaced my DVD player with a Blu-ray disc box, the title line-up in the format is still limited. My cable TV provider isn't in a hurry to launch an HDTV service, either. And Buenos Aires isn't London, so I can forget about ATC (Argentina's equivalent of the BBC) launching 'ATC HD' any time soon.
The second drawback is image quality. Why settle for LCD or plasma when, even if I bought the most advanced model on offer, I would be replacing my CRT TV with inferior technology? Yes, I would get many more pixels per centimetre of screen. But it's the quality of those pixels that counts and, no matter how hard the makers of LCD and plasma displays try, CRT technology is still unrivalled when it comes to factors such as contrast ratio.
There's little that engineers can do to accelerate the provision of HD content by studios and broadcasters. But some of the technical shortcomings of LCD and plasma TVs are about to be swept aside by a variety of emerging flat-screen display technologies.
For many years, engineers have been working on technologies that can match, and in some cases, improve upon, the form factors of large-screen LCD and plasma displays, while offering the contrast ratios, colour fidelity, brightness and other quality parameters previously confined to CRTs.
There are three reasons why these technologies are taking so long to become commercially available: technical challenges, unrealistic pricing, and commercial conflicts of interest.
In December, Sony launched the first commercial display to be based on one of these technologies. Still only available in Japan and the US, the Sony XEL-1 is the world's first organic light-emitting diode (OLED) television.
OLED research began in the 1980s, although it was only in the past three to five years that the technology started to find its way into small displays on mobile phones, digital cameras, car radios and portable digital media players.
The biggest advantage of OLED displays is that they don't need a backlight, unlike most high-performance LCDs. In OLED displays, the electroluminescent layer is made of a film of carbon-based compounds, which are deposited by a printing process. The organic layer is sandwiched between two conductors and two sheets of glass, and emits light when a voltage is applied to the conductors.
Doing away with the backlight simplifies manufacturing, enables thinner displays and cuts power consumption - in theory. OLED displays were first used in battery-powered devices partly because of their reputation for drawing less power than LCD screens. However, as screen size increases, engineers are having difficulties translating the energy-efficiency promises of OLED into reality.
The XEL-1, for example, whose screen measures only 28cm on the diagonal, is rated by Sony as consuming 34W. A detailed examination of the device carried out by a group of engineers and commissioned by Tech-On, a division of Nikkei Business Publications, found that average power consumption for a display showing a white screen was 28.4W, and 18.3W when it showed black.
For comparison with LCD technology, Sony's 48cm Bravia M series LCD consumes 50W. Katsuji Fujita, the former president and chief executive officer of Toshiba Matsushita Display Technology, was quoted as saying that, for screen sizes larger than 76cm, OLEDs currently consume two to three times more power than LCDs.
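Those raw wattages compare screens of different sizes, so it helps to normalise by screen area. The short Python sketch below does that, using the rated figures quoted above and assuming both screens have a 16:9 aspect ratio (an assumption; the article gives only diagonal measurements).

```python
import math

def screen_area_cm2(diagonal_cm, aspect=(16, 9)):
    """Area of a rectangular screen, computed from its diagonal
    and an assumed aspect ratio."""
    w, h = aspect
    scale = diagonal_cm / math.hypot(w, h)
    return (w * scale) * (h * scale)

# Rated power draw divided by screen area, in W per cm^2
xel1_density = 34 / screen_area_cm2(28)     # Sony XEL-1 OLED, 28cm diagonal
bravia_density = 50 / screen_area_cm2(48)   # Sony Bravia M series LCD, 48cm

# The ratio comes out at roughly 2, consistent with Fujita's
# "two to three times more power" remark for larger OLED screens.
print(f"OLED: {xel1_density:.3f} W/cm2")
print(f"LCD:  {bravia_density:.3f} W/cm2")
print(f"ratio: {xel1_density / bravia_density:.1f}x")
```

On these assumptions the small OLED draws about twice the power per square centimetre of the larger LCD, even though its headline wattage is lower.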
What is undeniable about OLED displays is their ability to adopt almost ridiculously thin form factors. The screen of Sony's XEL-1 is 3mm thick, not bad for the first commercial use of the technology. Sony has already unveiled a prototype screen with the same diagonal measurement but a tenth of the thickness, at just 0.3mm. It has also produced a 69cm prototype OLED TV offering Full HD (1920 x 1080) resolution, with the same contrast ratio as the XEL-1.
OLED screens produce better images than rival technologies, including CRTs. The XEL-1 has a contrast ratio of 1,000,000:1, a figure one gadget website annotated with the bracketed note "not a printing error". Such a contrast ratio makes possible 'absolute blacks', one of the holy grails of HD displays and one of the greatest shortcomings of LCDs. CNET reviewer David Katzmaier says of the Sony screen: "Blacks produced by this TV are absolute and visually indistinguishable from the black frame around the screen in a dark environment."
So, I'm finally ready to go and buy one of these, right? Not quite. For all the advantages that OLEDs bring, there are still some commercial and technical barriers to overcome before these products will be widely attractive.
Price is the biggest issue. Sony's XEL-1 is $2,500, enough to make even an early adopter blanch. Prices should start to drop once economies of scale kick in, although Sony has so far only been able (or willing) to make a couple of thousand XEL-1 units each month, in Japan and the US.
The biggest technical issue (apart from power consumption) is that OLED screens wear out too quickly. So far, the materials used in OLEDs will only work for between 10,000 and 14,000 hours before they begin to degrade. That means that a TV that is in use for an average of seven hours a day will have a maximum lifetime of just five-and-a-half years.
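That lifetime figure is easy to sanity-check. A back-of-envelope Python sketch, using the quoted 10,000 to 14,000 hour range and the seven-hours-a-day viewing pattern assumed above:

```python
HOURS_PER_DAY = 7                    # viewing pattern assumed in the text
LIFETIME_HOURS = (10_000, 14_000)    # quoted range before OLED materials degrade

for hours in LIFETIME_HOURS:
    years = hours / HOURS_PER_DAY / 365
    print(f"{hours} hours -> about {years:.1f} years")
# prints:
# 10000 hours -> about 3.9 years
# 14000 hours -> about 5.5 years
```

The upper bound matches the article's five-and-a-half years; at the lower end of the range, degradation could begin in under four.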
Despite these pricing and technical issues, Toshiba, Samsung, Panasonic and LG are all expected to bring OLED technology to market at some point over the next three years.
The second display technology challenging the large-screen status quo is laser TV. One of the main advantages of laser displays is that laser beams can be combined to produce about 90 per cent of the colours that humans can see, compared with around 50 per cent for LCDs and plasmas.
The reason we're not all enjoying the benefits of laser's wide colour gamut is that it has been too expensive to use in a rear-projection TV. Mitsubishi claims to have solved the cost issue and is about to launch the world's first commercial laser TV. Drawing less than 200W, it should be about twice as power-efficient as a similar-sized LCD, and three times as efficient as similar-sized plasma screens.
Speaking of size, Mitsubishi - unlike Sony in the OLED camp - isn't wasting any time and will be targeting the really large-screen sector from the word go. The Japanese vendor's debut laser TV, dubbed LaserVue, will come in 165cm and 185cm versions. The first units should ship this autumn.
Sceptics highlight two main drawbacks of laser TV technology: an eye-safety hazard posed by high-power lasers, and an image artefact known as speckle, caused by the narrowband light source.
Mitsubishi insists it has overcome both, by using integrated diffusion filters to reduce the threat from high-power lasers, and proprietary technology to tackle the speckling issue.
The first LaserVue TVs will feature a 120Hz refresh rate and will apparently be competitively priced against comparable LCD and plasma alternatives. But more work will be needed to reduce the 25cm depth of the cabinet.
The third display technology with a chance of dethroning the current incumbents is the field emission display (FED). It combines the image quality of CRTs with the slim shape of flat panels.
To achieve this, FEDs borrow from CRTs the idea of using a beam of electrons to light up a phosphor layer. In CRTs, the beam from an electron gun is swept across the phosphor coating of the tube's front glass to paint the image. In FEDs, carbon nanotubes are deposited to form a static two-dimensional array of millions of tiny electron guns that each illuminate a fixed patch of phosphor. Apart from shrinking the depth of the cabinet, to around 50mm judging by the first emerging screens, this allows for better energy efficiency than LCDs and plasmas. Contrast ratios in excess of 20,000:1 are also possible.
FED technology has been in development for decades. In 2006 Sony felt it was mature enough to spin off an independent company, called Field Emission Technologies, to commercialise its FED research. The Tokyo spin-off introduced its first product, a 49cm display called nano-Spindt FED, last year. It is being marketed for broadcasting, film-making, medical and other professional applications first. This will be a relief to broadcasting professionals, who are finding that the CRT master monitors they rely on to evaluate the quality of video signals are going out of production. As with other phosphor-based screen technologies, FED users will have to watch out for the display's susceptibility to phosphor burn-in.
There's one more pretender to the flat panel crown, and that's surface-conduction electron-emitter displays (SEDs). These are a variant of FED technology, with each phosphor pixel being illuminated by a single tiny cathode ray tube, rather than millions of carbon nanotubes.
SED developers claim benefits in power consumption, image quality, response times and thickness similar to those claimed for FEDs. But a continuing patent dispute involving Canon, Toshiba and a firm called Nano-Proprietary is making industry insiders pessimistic about SED's chances of joining OLED, laser TV and FED in the battle to conquer living rooms around the world. Mine included, of course.