Send your letters to The Editor, E&T, Michael Faraday House, Six Hills Way, Stevenage, Herts SG1 2AY, UK, or to email@example.com. We reserve the right to edit letters and to use submissions in any other format.
Rail before renewables
At last I read of government circles questioning the value for money of the subsidies paid to developers of wind farms and photovoltaic cell installations to produce ‘renewable’ electricity. Would it not be better for the Department of Energy to identify industrial installations that do not make efficient use of electricity and, if necessary, make some contribution towards the cost of improving their efficiency?
A major offender is Network Rail, with its 750V DC third-rail network in London and the South East. Last year its chief electrical engineer disclosed that some 25 per cent of the electrical energy supplied to the third-rail system during winter weekday peak travel periods is lost in the conductor rails.
I have no idea of the magnitude of the DC network electricity load, but if I make a stab at around 500MW it means that some 125MW of generator output capacity, as well as fuel, is wasted. These losses occur at the same times as the winter maximum demands on the national electricity grid, and could be reduced to about 7 per cent (90MW less) if the DC system were replaced by a 25kV overhead supply, or better still if the 25kV system were connected using the auto-transformer arrangement, creating in effect a very efficient 50kV railway supply network.
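Taking the letter's figures at face value, the savings arithmetic can be checked with a short calculation. Note that the 500MW peak load is the writer's own estimate, not a published figure:

```python
# Back-of-envelope check of the loss figures quoted above.  The 500MW
# peak DC load is the letter writer's guess, not published data.
load_mw = 500.0
dc_loss_fraction = 0.25   # ~25 per cent lost in the 750V DC conductor rails
ac_loss_fraction = 0.07   # ~7 per cent claimed for a 25kV overhead supply

dc_loss_mw = load_mw * dc_loss_fraction   # 125 MW wasted today
ac_loss_mw = load_mw * ac_loss_fraction   # 35 MW after conversion
saving_mw = dc_loss_mw - ac_loss_mw       # 90 MW of capacity freed at peak

print(f"saving at winter peak: {saving_mw:.0f} MW")
```

The 90MW figure in the letter is consistent with these assumptions.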
It is surely a no-brainer to suggest that the Departments of Energy and Transport and Network Rail get together to develop a long-term DC/AC changeover policy over a period of, say, 25 to 30 years, during which a dual 750V DC/25kV system would remain in operation until the changeover is completed.
The changeover will cost money. Could this be borrowed by the Treasury, with the capital charges, interest etc paid from funds currently used to subsidise renewables?
Reducing railway energy losses would free generating and transmission capacity for the increasing grid demand at peak load; this would be far more worthwhile expenditure and would meet the same goals of reducing fuel consumption and, consequently, carbon output. There would be railway operational advantages too: a higher voltage at the pantograph would allow better acceleration and higher speeds where required, both leading to greater track capacity, and there would be fewer winter delays and cancellations caused by iced-up conductor rails.
R Eaton MIET
How to end LED disappointment
‘Leading Lights’ (September 2012) raises a number of issues regarding domestic LED lighting, and the point made regarding “disappointed users” is a significant one.
There is no doubt that the end user of domestic LED lighting is ill-informed about lamp performance in terms of longevity and light output, and it would now seem that producers of LEDs at the ‘lower’ end of the market have an interest in keeping it that way. Purchasers usually base their selection on price, reassured by claims on the box of a life of thousands of hours. Their disappointment is justified when their purchase suffers poor light output or early mortality.
Few of these individuals can be expected to understand the technical specifications of the equipment, and they are therefore unable to distinguish between the top and bottom ends of the market. Simply buying the most expensive lamp in the hope that price equates to quality leaves the end user at the mercy of a commercially driven retail market.
If LED lamps are not to suffer a deteriorating reputation, the comment by Iain Macrae, president of the Society of Light & Lighting and global technical manager of Thorn Lighting, that “high quality players will… tell you the truth…” deserves a response: those players should develop some credible mechanism for presenting their case.
Is there not, therefore, a case to set up some kind of standard regarding performance for domestic LED lighting, and should not the IET be championing such a system?
David Swain MIET
In my experience, a critical and failure-prone part of modern ‘eco’ lighting technologies such as fluorescent tube and LED lamps is the power supply. Fluorescent tube replacement lamps for bayonet and Edison screw lampholders have complex inverters and switching power supplies integrated into the base of the lamp, and it is usually this part that fails. The longevity of the tubes themselves is another matter, presumably measured without regard to the integrated power supply.
I do not yet have much experience with LED lamps and LED retrofit replacements, but I presume that the diode junction that actually emits the light does not run at mains voltage, so there must be some sort of power adapter involved, with all its associated temptations for manufacturers to skimp on quality while advertising the product on data pertaining only to the light transducer itself.
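The writer's presumption is easily illustrated with typical figures (the forward drop and LED count below are assumed, plausible values, not from any particular lamp's datasheet):

```python
# Why a mains LED lamp needs a driver stage: rough, typical figures
# assumed (white-LED forward drop ~3V, UK mains 230V RMS).
import math

mains_peak_v = 230.0 * math.sqrt(2)     # ~325V at the crest of the sine
led_forward_v = 3.0                     # one white LED
leds_per_lamp = 6                       # a plausible retrofit-lamp count

string_v = leds_per_lamp * led_forward_v   # ~18V, far below mains
print(f"mains peak {mains_peak_v:.0f}V vs LED string {string_v:.0f}V")
# The gap is bridged by a small switching converter, which must also
# regulate current (LED brightness tracks current, not voltage), and
# that converter is the part most tempting to build down to a price.
```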
There seems to be a lack of reporting or professional interest in the integral power supplies of these, and very many other, products. Power supply technology is a vast industry in its own right and appears to be dominated by anonymous far-eastern manufacturing.
I believe that heat is a major cause of failure in small SMPSUs jammed into tiny packages (failed lacquer insulation on toroids and burst capacitors are commonly seen). I would be interested in more reporting on power supplies: “the engines of equipment”, as my old chief engineer used to say.
Tristan Thorne MIET
HVDC 40 years on
In his remarks on the occasion of the opening of Alstom Grid’s new MaxSine valve factory in Stafford (News, July 2012), Dr Tom Calverly accurately described the collegial and co-operative atmosphere that existed among the manufacturers of HVDC equipment in the 1960s and 1970s, to the benefit of potential customers.
As project manager for what turned out to be the world’s first large solid-state HVDC facility, the Eel River HVDC Converter Station, I found this atmosphere most refreshing, having previously doubted that it could exist. We conducted simulation studies with each of the prospective suppliers to ensure that each understood our specification and that we in turn could understand the language of each tender. In the end we had four excellent alternatives from which to choose.
The 1968 IEE Manchester HVDC Conference transcript became our book of reference for the project, and in turn I reported on the performance of the Eel River station at a 1973 conference in London. I recommend the upcoming 2012 Birmingham event, documenting the state of the art, to anyone contemplating an HVDC transmission project.
In retrospect I must now confirm the opinion I formed at the time: line-commutated HVDC converter technology (current-source conversion) is a brute-force technique for converting DC to AC. In simplest terms, this is because a mercury-arc valve or a thyristor cannot be turned off once conduction has been triggered; the large chunks of energy left over from making sine waves out of square ones must be filtered out by massive and expensive harmonic filters and synchronous condensers if disruption of wireline telecom services is to be avoided.
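The harmonic burden described here follows a standard textbook pattern: an ideal p-pulse line-commutated converter injects AC-side current harmonics at orders n = kp ± 1, with amplitudes falling roughly as 1/n of the fundamental. A minimal sketch of that idealised result (it ignores commutation overlap and supply imbalance):

```python
# Characteristic AC-side harmonics of an ideal p-pulse line-commutated
# converter: orders n = k*p +/- 1, amplitude ~ 1/n of the fundamental.
# Idealised textbook model; real converters also show smaller
# non-characteristic harmonics from overlap and imbalance.
def characteristic_harmonics(pulse_number, k_max=3):
    harmonics = []
    for k in range(1, k_max + 1):
        for n in (k * pulse_number - 1, k * pulse_number + 1):
            harmonics.append((n, 1.0 / n))  # (order, per-unit amplitude)
    return harmonics

for order, amplitude in characteristic_harmonics(6):
    print(f"harmonic {order:2d}: ~{amplitude:.2f} pu of fundamental")
```

A 6-pulse bridge thus injects 5th, 7th, 11th, 13th… harmonics, which is why the filtering plant the letter mentions is so substantial; a 12-pulse arrangement cancels the 5th and 7th but still needs filters for the rest.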
Voltage source converter technology seems to be a much more elegant and potentially less costly technique due to the ability of the insulated gate bipolar transistor to be turned both on and off on command.
That said, the Eel River HVDC Converter Station was completed on time and on budget in August 1972 and has given 40 years of yeoman service. In February 2011 it became the 11th Canadian project to be recognised as an engineering milestone by the IEEE.
Frank H Ryder CEng FIEE FIET
A question of choice
When I lived in Kuala Lumpur several decades ago the supply authority, Lembaga Letrik Negara, employed a very simple method for reducing peak maximum demand. In residences, our electric stove and our storage-type water heater were controlled via an ironclad two-pole, two-throw switch so that both could not be on at the same time. This rarely caused any inconvenience. Sometimes simple solutions are the best.
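The interlock described is in effect a changeover contact: one feed, two loads, and the switch position selects which load is energised. A minimal sketch of the idea (load names are illustrative):

```python
# Model of a two-pole, two-throw changeover switch feeding two loads:
# the switch position selects which load is energised, so the stove
# and the water heater can never draw power simultaneously.
def energised(switch_position):
    assert switch_position in ("stove", "heater")
    return {"stove": switch_position == "stove",
            "heater": switch_position == "heater"}

for position in ("stove", "heater"):
    state = energised(position)
    assert not (state["stove"] and state["heater"])  # interlock holds
```

The demand reduction comes for free: the utility never sees both heavy loads on one tariff meter at once, with no metering or communications needed.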
Stuart Bridgman MIET
Wellington, New Zealand
DAB channel selection
I wonder why the design of so many DAB radios is so poor in respect of channel-selection ergonomics. The most common arrangement requires three operations to recall a preset channel: press the presets button, rotate the volume-control knob through the list of preset channels while watching the display or counting the clicks, then press the volume knob axially.
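One way to see the ergonomic cost is simply to count user operations. This sketch compares the menu-driven flow described above with dedicated preset keys; the step names are illustrative, not taken from any particular radio's manual:

```python
# Counting user operations for the two preset-recall schemes.
# Steps and names are illustrative assumptions.
def recall_via_menu(preset_number):
    steps = ["press presets button"]
    steps += ["rotate knob one detent"] * preset_number  # scroll the list
    steps += ["press knob axially"]                      # confirm choice
    return steps

def recall_via_direct_key(preset_number):
    return [f"press preset key {preset_number}"]         # one dedicated key

print(len(recall_via_menu(4)), "operations vs",
      len(recall_via_direct_key(4)))
```

For the fourth preset, the menu flow costs six operations against one for a dedicated key, and the gap grows the further down the list the desired station sits.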
I am sure any competent radio designer could do better than this with little effort or development cost. This could even result in a slight reduction in manufacturing costs together with a new advertising opportunity.
R H Pearson