In this issue: Putting the pressure on consultant fees, energy saving goes unnoticed, China to build 47 airports, fuel-cell flight makes history, and more.
Microprojector writes on retina
By Luke Collins
A company specialising in MEMS (micro electro-mechanical systems) is developing technology to write video images directly on to the human retina. Microvision has used the underlying technology to launch an ultra-miniature projection display unit that it expects will be built into some mobile phones next year, as well as being offered as a standalone device.
Alexander Tokman, CEO of Microvision, said: "We are developing eyeglasses projecting directly on to the retina. It's very safe, meets stringent regulatory requirements and uses the same projection engine."
That engine is based on a single MEMS mirror, illuminated by red, green and blue LEDs, which vibrates 30 million times a second to create an 852 x 480 pixel display. The PicoP projection engine, which will go into mobile phones, is just 7mm thick, has a volume of about 5cm3 and weighs between 30 and 50g.
Because the display uses LEDs for illumination, there are no focus optics and so the output is always in focus over distances from 20cm to 5m. This means the projector can produce images from the size of a magazine page up to a 2.5m diagonal. Since the illuminating LEDs are only turned on where light of that colour is needed, Microvision claims the projector is also power-efficient, using just 1.5W.
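The quoted mirror rate can be cross-checked against the display's pixel throughput. A back-of-envelope calculation, assuming a 60Hz refresh rate (the article does not state the frame rate):

```python
# Pixel throughput for a scanned 852 x 480 display.
# Assumption: 60Hz refresh (not stated in the article).
width, height, refresh_hz = 852, 480, 60

pixels_per_frame = width * height              # 408,960 pixels
pixels_per_second = pixels_per_frame * refresh_hz

print(f"{pixels_per_second:,} pixel positions per second")
# Roughly 24.5 million; with blanking and overscan overheads this is
# consistent in order of magnitude with the quoted mirror rate.
```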
"The key application of this is the spontaneous sharing of video," said Tokman. "There's a real demand for this. The mobile operators also hope that people will download more content over their networks if they can project it."
Earlier this year Microvision said it was working with Motorola to demonstrate a prototype mobile phone with embedded projector to selected customers in order to gauge market demand and customer requirements.
The PicoP engine will cost around $100 as a component to include in another device, while Tokman reckons that a standalone projector will cost around $300. The technology is also being explored for use in head-up displays in cars, where its focus-free nature, high contrast and dynamic range will be appropriate.
High-speed net access 'vital'
Super-fast broadband - next generation access and networks - is crucial to the UK's future, says telecoms regulator Ofcom. In a speech at the IET on 16 April, Ofcom CEO Ed Richards said: "These networks form part of the critical infrastructure of the country's economy and will be central to our future."
The new super-fast access networks will give people anything from 20 to 100Mb/s and more. Virgin Media and BT are both running pilot projects, but a nationwide network would have a very stiff price tag, possibly as high as £15bn. Doubts remain over how quickly the commercial sector would be willing to shoulder the cost.
"In some Far Eastern markets, there has been substantial direct government intervention to help speed the deployment," said Richards, but he added that there was uncertainty about the commercial viability of new services.
To see Ed Richards' speech on iet.tv, visit http://tv.theiet.org/technology/communications/1642.cfm [new window].
Last-mile investment needed in UK
By Dominic Lenton
Fears that the Internet is in danger of collapsing under the sheer weight of traffic are misplaced, says an optronics expert, but British operators and the government need to do more if households are to enjoy its full benefits.
The opinion was expressed by Professor David Payne, whose work on developing the fibre amplifiers on which international communications networks rely is in the running for one of the world's biggest technology awards.
Professor Payne, director of the Optoelectronics Research Centre at the University of Southampton and a Fellow of the IET, is one of two British researchers shortlisted for the million-euro Millennium Technology Prize, "Finland's tribute to life-enhancing technological innovation".
Payne is part of a group nominated for their outstanding contributions to telecommunications through the invention in the 1980s of the erbium-doped fibre amplifier. Although he was competing with fellow nominees Professor Emmanuel Desurvire of Thales Corporate Research & Technology in France and Dr Randy Giles of Bell Laboratories in the US, the three have been shortlisted together in recognition of the collective impact they made on global telecoms.
Large-scale adoption of EDFAs has revolutionised the world of high-speed and long-distance communication by providing a way of amplifying optical signals directly without the need to convert them to electric signals.
More than 20 years later, the recent dramatic uptake of streaming media products like the BBC iPlayer has prompted warnings that networks may be unable to cope. Those fears are the result of a "massive misunderstanding", Payne said.
"The UK has failed to provide the bandwidth to the home, but that's not the Internet, that's the last mile," he told E&T. "The Internet has more than enough capacity at the moment."
Payne says that describing the services available to UK homes as 'broadband' is a misnomer, and that the term should be reserved for data rates of a gigabit per second. Asia Pacific countries have rolled out networks operating at these speeds as an act of faith, but UK operators want a business model.
"The market would argue that the conditions are insufficient to make the investment required," he said. "What we need to do is to stimulate rollout through creative measures in a partnership between government and the finance world that comes up with ways that make sense commercially."
The Millennium Technology Prize is presented every two years by Technology Academy Finland. Nominees must have significantly improved the quality of human life - previous winners include Sir Tim Berners-Lee for his part in creating the World Wide Web.
The overall winner will be named at a ceremony in June. The prize pool is €1.15m, with the winner collecting €800,000, and the other shortlisted 'laureates' awarded €115,000 each.
The list of finalists was revealed earlier this month at simultaneous events in Finland, France, the US and the UK. Payne is joined on the shortlist by Professor Sir Alec Jeffreys of the University of Leicester Department of Genetics, who developed the DNA fingerprinting techniques used in identification of criminal suspects and in paternity and immigration disputes.
The other candidates are Professor Robert Langer of MIT for his work on biomaterials for controlled drug release and tissue regeneration, and Dr Andrew J Viterbi for the Viterbi algorithm, which has become key in wireless and digital communications systems.
Speaking at the Royal Academy of Engineering in London, Payne and Jeffreys agreed that British involvement in two of the four contending innovations was evidence of some factor that helps the country's research base to punch above its weight.
"We do it very efficiently for the amount of money we spend," said Payne. "It's something about the British psyche. Perhaps it comes from our education system."
Client pressure puts squeeze on consultant fees
Consultants have seen their daily rates plummet by almost a fifth in the past year, according to a survey which found coaching and human resources specialists still charge nearly twice as much as telecoms and IT engineers.
Online consultants' network Skillfair has been running its annual survey of UK fees since 2004. The 2008 survey is based on input from more than 500 consultants on the organisation's database. The average daily rate charged was £535, almost 20 per cent lower than the figure of £662 reported last year.
However, rates vary widely by specialism. Change management came out on top with fees of £806 a day, compared with £358 for telecoms. Coaching, human resources and supply-chain logistics consultants were able to charge more than £650, with at least one respondent bringing in over £1,500 per day, while those in engineering and IT solutions failed to break the £500 mark.
Skillfair managing director Gill Hunt acknowledged that negotiating fee rates is something that many consultants find difficult. "The balance between the risk of not getting the work and the benefits of asking for and obtaining a higher rate can be quite fine, and many of us take the easy route and just charge what we've always charged," she said.
In-flight phone calls get closer
By David Sandham
New pan-European rules mean passengers will be able to use their mobile phones on planes before the end of the year. The European Commission introduced the rules this month, covering licensing and technical specifications for GSM communication services on aircraft.
The handsets are intended to connect from the plane (via an on-board picocell, server and modem) to a satellite, but there is a danger that they could erroneously pick up mobile networks on the ground. Because the terrestrial base stations are far away, the phones would transmit at maximum power, which could interfere with aircraft equipment. To address this, a network control unit in the aircraft will drown out the terrestrial signal with a noise signal inside the cabin, and the minimum height for any transmission has been set at 3,000m.
In-flight mobile communication services have already been tested in France and Australia. On 2 April, Air France and OnAir launched the second phase of a three-month trial, using a specially equipped Airbus A318 and Inmarsat satellites. The service allows up to six simultaneous calls.
A number of telecom operators and airlines are planning to launch services in 2008, with pricing likely to be a key factor in their success. Around 90 per cent of European passengers take their phones on planes.
'Three little pigs' project blows up a storm
Engineers at Cambridge Consultants plan to "huff and puff and blow a house down" as part of a Canadian project to improve the safety of buildings in extreme winds.
The company has developed a way of networking its wind-generation technology to create a huge wind simulator that can generate pressures equivalent to a Category 5 hurricane. It will be put into action this summer at the Insurance Research Lab for Better Homes in London, Ontario, as part of the University of Western Ontario's 'Three Little Pigs' project.
"By blowing the house down, we will be able to provide guidance not only with regard to making them safer, but how to do so economically," said Dr Gregory Kopp, who is leading the project.
"We are going to determine how the rapid changes in pressure and direction of wind cause houses and other light-frame buildings to respond. So far, no one has been able to either simulate this or measure it in an actual storm," Kopp remarked. "Thanks to the breakthrough valve and concept models from Cambridge Consultants, we can now create a realistic simulation of a hurricane that will greatly aid our ability to assess the integrity of the structure of a building, the pathways by which the load is transmitted through the structure, and the performance of components."
The research is expected to lead to more formalised techniques for weather-proofing low-rise buildings and to deliver protection against evolving weather hazards by providing the know-how to improve building codes and quality control.
The wind simulator consists of 70 networked modular pressure actuators mounted against the outside wall of a full-scale two-storey pitched-roof house.
Each actuator has a fast-acting valve system that allows the wind pressure to reverse direction up to seven times a second. The control and networking system co-ordinates the actuators to replicate the complex wind effects over the entire surface of the building.
Manned fuel-cell flight makes aviation history
By Bob Cervi
US plane-maker Boeing has completed the first manned flight using hydrogen fuel-cell power, in a further move to 'green' the aviation industry.
The technology was installed in a two-seater Dimona motor-glider which made three test flights from an airfield in Spain.
A hybrid system of proton-exchange membrane (PEM) fuel cells and lithium-ion batteries powered an electric motor driving the plane's propeller. When the plane reached cruise altitude, the pilot disconnected the batteries and cruised at 120km/h for 20 minutes on power generated by the fuel cells.
According to Boeing, PEM fuel-cell technology could potentially power small manned and unmanned aircraft, producing only water and heat as emissions.
John Tracy, Boeing's chief technology officer, said the flight was a first in the history of aviation. "Boeing recognises that pollution represents a serious environmental challenge," he added.
In the longer term, solid oxide fuel cells could be applied to secondary power-generating systems, such as auxiliary power units for large commercial planes.
Boeing said it did not believe that fuel cells would ever provide primary power for large passenger planes, but it would continue to investigate their potential as well as other sustainable alternative fuel and energy sources that improve environmental performance.
Earlier this year, a Boeing plane was flown from London to Amsterdam with one engine powered by biofuel, a product derived solely from crops. Rival aerospace group Airbus also flew a commercial aircraft for the first time using a mix of aircraft fuel and a synthetic liquid fuel processed from gas.
The aviation industry is under pressure to cut emissions and find alternatives to traditional kerosene fuel. A number of European groups, including Airbus, Rolls-Royce and Saab, have signed up to the EU-led 'Clean Sky' initiative aimed at developing greener aircraft.
The European Union is also supporting 'Solar Impulse', a project in Switzerland to develop a fuelless aircraft powered solely by sunlight.
Verizon wins - but Google calls the tune
By Paul Dempsey
The US 700MHz auction has ended with all the players declaring victory.
The country's two largest cellular carriers, Verizon Wireless and AT&T, took the largest viable blocks of spectrum. They will ultimately use the former analogue TV frequencies for 4G mobile communications based around the LTE and HSDPA+ standards. All as per the original masterplan.
Google did not win any licences, but it did see the value of the C block of spectrum pass a $4.6bn reserve. That reserve was a precondition for the Federal Communications Commission (FCC) to enforce a set of open access conditions Google has been seeking, which will require the victorious C block holder, Verizon, to sub-let spectrum to other users.
The FCC had feared that the US economic downturn would hit the auction hard. Instead, a little over $19bn is headed for Treasury coffers.
At the same time, and unlike the post-3G auction era, nobody is talking about the looming consequences of overbidding, nor about operators having overestimated demand for new wireless services. Even the build-out of new networks is seen as expensive, for sure, but also straightforward - an incremental addition to the infrastructure the carriers already have in place.
The lesson from the US, then, would appear to be that there is a huge amount of money in that old TV spectrum. But the innovative part of the 700MHz story concerns the opening up of capacity to new entrants, new services and new devices, beyond the control of the incumbent telcos. Here, early signs are that the technological and economic path will prove rocky.
When the FCC lifted its gag on bidders' public statements about the auction, Google counsels Richard Whitt and Joseph Faber slapped this on the company's public policy blog: "Our participation in the auction helped ensure that the C Block met the reserve price. In fact, in ten of the bidding rounds we actually raised our own bid - even though no one was bidding against us - to ensure aggressive bidding on the C Block. In turn, that helped increase the revenues raised for the US Treasury, while making sure that the openness conditions would be applied to the ultimate licensee" (our italics).
I don't know about you, but I cannot read that without hearing Dick Dastardly's hound Muttley snicker away in the background. But, it also needs saying that the sniping here is hardly one-sided. The telcos have also - albeit less publicly - been having a pop.
But it's not just that the relationships between traditional and non-traditional operators seem especially tense. Analysts and participants unanimously agree that such open access as the FCC guarantees has been phrased in its regulations in a very loose way. The scope for 'interpretation' (a Washington euphemism for 'lobbying') is enormous. On top of that, the FCC will almost certainly have new leadership by the time any new 700MHz services or agreements emerge towards the end of 2009.
So, other governments might see a framework from which to profit from TV's analogue switch-off, but the lessons about how to manage the aftermath are still to come.
Final sprint for flexi-displays
By Chris Edwards
The makers of flexible displays are racing to have products out on the market by the end of the year, aiming for a new generation of devices such as e-book readers and ultra-low-power pocket organisers.
Jennifer Colegrove, senior analyst at iSuppli, argues that this is the year of the flexible display: "Why is 2008 year one of the flexible display? Because, before this year, you could only see demonstrations."
This year, Colegrove said, will see the first shipments of Polymer Vision's Readius, with companies such as Plastic Logic, PVI and LG Display readying products for delivery by the end of the year.
Konrad Herre, vice president of manufacturing at Plastic Logic, said the company is now moving production equipment into its fab at Dresden, Germany: "The plant will make its first modules available at the end of this year, with the production ramp-up next year."
Polymer Vision's chief technology officer Edzer Huitema demonstrated the Readius at the Printed Electronics Europe conference in Dresden earlier this month following his keynote speech. He said the company is working with a partner to incorporate the phone module so that the device can download information using 3G or USB.
"We are finalising production of the displays and the device itself," said Huitema. Polymer Vision expects to be able to start shipping the device to customers in the middle of the year, with a price tag roughly equivalent to that of a high-end smartphone. The device has a display that folds around the outside of the phone module.
The manufacturers going into production this year are using batch-based techniques derived from those used to make mainstream, and rigid, liquid crystal displays. But others are looking at alternatives. HP is pursuing a roll-to-roll process in a bid to make ultra-cheap displays, although the company is struggling with yields. "We feel that it is feasible to make electronics in a roll-to-roll environment," said Carl Taussig, programme manager at HP. "I am a little anxious that it has taken so long because people have gone to production with batch manufacturing."
Insulator points to one-shot printed circuits
Teams working on plastic electronic devices are turning to new types of material to improve performance, as conventional polymers are too slow. One material developed at the University of Minnesota has proved so good at its job that it could usher in ultra-cheap printed electronics.
Professor Daniel Frisbie's group at the university has developed an insulator for the gates of transistors that uses a similar principle to that used in so-called supercapacitors to improve their dielectric constant.
The materials, which dissolve metal ions in a polymer, work by polarising when a charge is applied. Whereas a conventional gate insulator such as the silicon dioxide used in today's transistors has a dielectric constant of around four, these materials have one closer to 1,000.
Thanks to the high capacitance, the Minnesota team found it was possible to build a completely flat transistor. Instead of having to put the gate on top of the transistor's conducting channel, it can be placed to the side.
"How can it work with a gate placed off to the side?" asked Frisbie. "It is because we have this incredibly polarisable medium. It is a distinct advantage when you consider the printing process, because you can print all of the device in one step."
"The devices made this way are slower," Frisbie admitted. "Nevertheless, they work, and with some optimisation we can improve their performance."
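The benefit of the higher dielectric constant can be illustrated with the standard parallel-plate capacitance formula, C = eps0 x eps_r x A / d. A minimal sketch, using illustrative (not measured) gate dimensions:

```python
# Parallel-plate gate capacitance, comparing a conventional SiO2 gate
# (dielectric constant ~4) with the polymer-electrolyte insulator
# (~1,000). The geometry values below are illustrative only.
EPS0 = 8.854e-12                # vacuum permittivity, F/m

def gate_capacitance(eps_r, area_m2, thickness_m):
    return EPS0 * eps_r * area_m2 / thickness_m

area = 100e-6 * 100e-6          # hypothetical 100um x 100um gate
thickness = 100e-9              # hypothetical 100nm insulator

c_sio2 = gate_capacitance(4, area, thickness)
c_poly = gate_capacitance(1000, area, thickness)

print(f"SiO2 gate:    {c_sio2:.3e} F")
print(f"Polymer gate: {c_poly:.3e} F")
print(f"Ratio: {c_poly / c_sio2:.0f}x")   # 250x more charge per volt
```

The 250-fold increase in charge per volt is what lets the gate's field reach the channel even from the side, rather than only from directly above it.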
China to build 47 airports in ten years
By William Dennis
China intends to build 47 new airports from 2011 to 2020 as part of a plan to develop the country's air transport industry.
The investment of 320 billion yuan (US$43.5bn) was approved recently by the State Council in Beijing under the National Civil Airports Distribution Plan (NCADP).
The airports will be built in five regions - northern, eastern, central south, south-east and north-west. The investment is in addition to the 140 billion yuan already being spent under the 2006-2010 five-year plan to build 48 airports, expand 71 existing facilities and relocate 11 others.
The second Beijing Airport, which was approved in March, is a separate project and will be funded separately.
When the NCADP project is completed in 2020, there will be 244 airports, serving 82 per cent of a population that is projected to rise from 1.3 billion to 1.47 billion over that period.
Wang Changshun, deputy director of the General Administration of Civil Aviation of China (CAAC), told E&T in Beijing recently that, under the NCADP, provincial capitals and the capitals of autonomous regions and municipalities directly under the central government will have better access to air transport.
"With the booming economy, more Chinese citizens are affluent and want to fly," Wang said. There is a particular need for more airports in the less-developed western region.
Malaysia rethinks contract
By William Dennis
The Malaysian government may allow more telecommunications companies to participate in a proposed $4.75bn high-speed broadband services project.
The contract, which was initially awarded to Telekom Malaysia in September, is to be rolled out over ten years, with about a third of the funding coming from the state.
The government was supposed to sign the deal with Telekom Malaysia in February, but the signing was called off and no reasons were given. A Ministry of Finance official would say only that the deal is under review.
Several companies had voiced their displeasure that such a huge contract was awarded to one company. They argue that the proposed network should be built by a consortium of companies, so that each could use it to roll out their services.
Awarding the contract to a single company, they argue, runs counter to the Communications and Multimedia Act 1998, which promises more competition and less regulation.
Energy saving goes unnoticed
By Christine Evans-Pughe
Following a four-month laboratory trial with 100 domestic fridge-freezers, a British start-up called RLtec has announced successful initial findings for 'dynamic demand' control - a way of adjusting the energy requirements of electrical appliances according to the ups and downs of the national grid's AC frequency.
RLtec is using a software algorithm within the temperature control loop of the fridges that turns the compressor off when the supply frequency falls (indicating too much demand) and turns it on when the frequency rises (indicating too much supply), as long as the fridge is in its operating zone. "The technology works in any appliance where you have stored energy and discretion over its use," explained RLtec's CEO Andrew Howe.
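The decision logic described above can be sketched as follows. This is a minimal illustration assuming a 50Hz UK grid, with hypothetical frequency thresholds and temperature limits; RLtec's actual algorithm is proprietary and makes finer continuous adjustments.

```python
# Hypothetical dynamic-demand rule for a fridge compressor, as described:
# low grid frequency -> excess demand -> switch the compressor off;
# high grid frequency -> excess supply -> switch it on; but only while
# the cabinet temperature stays inside its safe operating band.
NOMINAL_HZ = 50.0                # UK grid; thresholds are illustrative
LOW_HZ, HIGH_HZ = 49.95, 50.05
T_MIN, T_MAX = 2.0, 6.0          # acceptable cabinet temperatures, deg C

def compressor_on(grid_hz, cabinet_temp_c, currently_on):
    # Food-safety temperature limits always override grid balancing.
    if cabinet_temp_c >= T_MAX:
        return True              # must cool regardless of grid state
    if cabinet_temp_c <= T_MIN:
        return False             # must not over-cool
    if grid_hz <= LOW_HZ:
        return False             # shed load: demand exceeds supply
    if grid_hz >= HIGH_HZ:
        return True              # absorb load: supply exceeds demand
    return currently_on          # inside the deadband, hold state

print(compressor_on(49.90, 4.0, True))   # False: frequency dip, shed load
print(compressor_on(50.10, 4.0, False))  # True: surplus, absorb load
print(compressor_on(49.90, 6.5, False))  # True: temperature overrides grid
```

The key design point is the ordering of the checks: the appliance's own job (keeping food cold) takes priority, so grid balancing only uses the slack within the operating band.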
Using this approach, RLtec hopes to be able to sell a low-carbon automatic energy-balancing service to grid operators based on populations of thousands of fridges. The idea is to replace some of the UK's balancing capacity, which today mainly comes from power stations burning fossil fuels.
RLtec's announcement follows the news last year that the Department for Business, Enterprise and Regulatory Reform (BERR) will be funding a study led by Imperial College that will involve installing fridges fitted with dynamic demand controllers and data-loggers in UK homes. "We will construct a new assessment platform to test a dynamic demand model derived from the empirical data, and pave the way for a new market in intelligent demand-side techniques," said Michael Hill-King, Imperial's programme manager for industrial research collaboration.
Millions of domestic electrical appliances, such as fridges and air conditioners, have no need for their energy to be delivered on a precise timescale (give or take a few minutes). There is therefore growing interest in recruiting them not only for 'greener' energy balancing, but also for taking the strain off the grid during times of peak demand, and even for helping to integrate intermittent power sources, such as wind, into national grids.
While RLtec's algorithm makes fine second-by-second adjustments to the power drawn by the fridge, the US Department of Energy's Pacific Northwest National Laboratory (PNNL) has developed a small controller board designed to shed load, like an electricity substation.
PNNL recently published a report on the results of its first residential trials of the 'Grid Friendly' controller. In a year-long collaboration with the domestic appliance company Whirlpool, PNNL put Grid Friendly Appliance boards into 150 dryers and 50 water heaters in 150 homes in Washington and Oregon. Whenever PNNL's controllers detected the 60Hz grid frequency dipping to 59.95Hz they shut off the heating element for two minutes. If after that time the grid was still unstable, the controllers turned off the element for another two minutes and so on for up to ten minutes. In the dryers, the motor stayed on, keeping the drum turning so the clothes didn't wrinkle.
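The Grid Friendly controller's timed shedding can be sketched as a simple schedule. A minimal sketch, assuming the grid frequency is re-checked once per two-minute interval (the report as summarised above does not specify the exact sampling scheme):

```python
# Sketch of the shedding schedule described above: when the 60Hz grid
# dips to 59.95Hz, hold the heating element off in two-minute intervals,
# re-checking the grid each time, for at most ten minutes in total.
TRIGGER_HZ = 59.95
INTERVAL_MIN = 2
MAX_SHED_MIN = 10

def shed_minutes(frequency_readings_hz):
    """Given grid frequency sampled once per interval, return how many
    minutes the element is held off. The readings are illustrative."""
    total = 0
    for hz in frequency_readings_hz:
        if hz > TRIGGER_HZ or total >= MAX_SHED_MIN:
            break                # grid recovered, or ten-minute cap hit
        total += INTERVAL_MIN    # stay off for another two minutes
    return total

# Grid dips, stays unstable for two checks, then recovers:
print(shed_minutes([59.94, 59.94, 59.97]))  # 4 minutes off
# Grid never recovers: shedding is capped at ten minutes:
print(shed_minutes([59.90] * 8))            # 10 minutes off
```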
At the end of the year, most of the residents said they hadn't noticed their dryers and water heaters switching on and off. Robert Pratt, PNNL's manager for the programme, said: "We fully expected this would be beneath the threshold of awareness of most if not all customers, and it was."
PNNL plans to conduct a larger pilot with 1,000 or more homes next year.
Whirlpool is also keen to test the concept in more homes and experiment with other functions that could be safely interrupted, such as the automatic defrost function in fridges, according to JB Hoyt, Whirlpool's director of government relations. "We're also looking to get a grid operator involved, as they are the biggest beneficiaries of load-balancing appliances," he added.
Meanwhile, encouraged by the results of its fridge trial so far, RLtec is putting a team together including people from the UK National Grid to test the concept further.
Supercomputer helps design HIV therapy
Researchers at the University of Edinburgh and IBM have embarked on a project that will combine supercomputing simulations with laboratory experiments to design drugs aimed at inhibiting infection by HIV.
The project is focused on how the human HIV-1 virus attaches to cells in the body and injects its genetic material. Researchers are examining a fragment of the surface protein of the virus, known as a peptide, which is crucial in stimulating the body's immune response to viral attack.
Understanding the structure and behaviour of the peptide allows the simultaneous design of multiple inhibitor drugs capable of targeting the infection process. This multi-strike approach, it is believed, will prevent the virus from mutating and thereby invalidating the drug therapy, as it does with single inhibitors.
Pioneering atomistic simulation methods and software will run on the university's massively parallel IBM BlueGene/L supercomputer, in conjunction with high-accuracy experimental measurement techniques, to probe the properties of amino acids and small peptides, the 'building blocks' of proteins. Such investigations are key to anti-viral therapy based on the simultaneous development of multiple targets.
"Early results show we can use computers to simulate which molecules can stop the HIV virus from infecting humans," said Jason Crain of the University of Edinburgh's School of Physics and divisional head of science at the National Physical Laboratory. "Drug makers could then use this information to develop those drugs more rapidly."
The five-year project brings together bio-medical, supercomputing, and scientific measurement expertise in "a gratifying, multi-disciplinary collaboration", added Professor Crain.
Researchers announce sensor breakthrough
Researchers at the US National Institute of Standards and Technology (NIST) have demonstrated an imaging system that detects naturally occurring terahertz radiation.
Biological and chemical samples emit characteristic signatures of terahertz radiation (300GHz-3THz), but detecting and measuring them poses a particular challenge, because the signals are weak and are absorbed rapidly by the atmosphere.
At the heart of the NIST prototype imager is a tiny device that measures incoming terahertz radiation by mixing it with a stable internal terahertz signal. This mixing occurs in a thin-film superconductor, which changes temperature upon the arrival of even a minute amount of radiation energy. The slight frequency difference between the two original terahertz signals produces a more easily detected microwave frequency signal.
The system can detect temperature differences smaller than half a degree Celsius.
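The down-conversion at the heart of the imager is the standard heterodyne principle: multiplying two signals produces components at their sum and difference frequencies, and a filter keeps only the easily handled difference tone. A scaled-down numerical illustration (kHz rather than THz, and not NIST's implementation):

```python
# Heterodyne mixing demo: multiplying a 1000Hz "signal" with a 1010Hz
# "local oscillator" yields a strong component at the 10Hz difference
# frequency, far easier to detect than either original tone.
# Frequencies are scaled down from terahertz for numerical convenience.
import numpy as np

fs = 8000                            # sample rate, Hz
t = np.arange(fs) / fs               # one second of samples
f_signal, f_lo = 1000, 1010

mixed = np.sin(2*np.pi*f_signal*t) * np.sin(2*np.pi*f_lo*t)

# A crude low-pass: inspect only the spectrum below 100Hz, as the
# mixer's output filter would, discarding the 2010Hz sum component.
low_band = np.abs(np.fft.rfft(mixed))[:100]   # rfft bins are 1Hz apart
peak_hz = int(np.argmax(low_band))

print(f"Strongest low-frequency component: {peak_hz} Hz")
# The 10Hz difference tone dominates the low band.
```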