vol 8, issue 2

Engineering Grand Challenges

11 February 2013
By Aasha Bodhani, Jason Goodyer, Abi Grogan, James Hayes, Mark Venables, Vitali Vitaliev

[Graphic: The 14 grand challenges and progress to date]

On 12-13 March the IET hosts a major international summit in London organised by the national engineering academies of the UK, US and China to discuss progress on 14 'grand challenges' identified five years ago by America's National Academy of Engineering. We look at what they are and how close the world is to solving them.

Make solar energy economical

If we are to move away from fossil-fuel-driven energy solutions then the burden will need to be taken up by renewable energy sources, and the most bountiful among these is, of course, the power of the sun.

Over the period 2000-11, solar PV was the fastest growing renewable power technology worldwide. Cumulative installed capacity of solar PV reached roughly 65GW at the end of 2011, up from only 1.5GW in 2000.
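Taken at face value, those two capacity figures imply a striking rate of expansion. A minimal sketch of the arithmetic, using only the numbers quoted above:

```python
# Implied compound annual growth rate of installed solar PV capacity,
# using only the figures quoted above (1.5 GW in 2000, ~65 GW at end-2011).
start_gw, end_gw = 1.5, 65.0
years = 2011 - 2000

cagr = (end_gw / start_gw) ** (1.0 / years) - 1.0
print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 41% per year
```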

Concentrated solar power (CSP) is a re-emerging market. Roughly 350MW of commercial plants were built in California in the 1980s; activity started again in 2006 in the United States and Spain. At present, these two countries are the only ones with significant CSP capacity, with about 1GW and 500MW installed respectively, and more under construction or development.

According to International Energy Agency (IEA) analysis, under extreme assumptions solar energy could provide up to one-third of the world's final energy demand after 2060.

There can be no doubt that, on paper at least, solar power is an attractive proposition. Its availability far exceeds any conceivable future energy demands. But exploiting the sun's power is not without challenges.

Overcoming the barriers that could potentially slow down widespread solar power generation will require engineering innovations in several arenas - for capturing the sun's energy, converting it to useful forms, and storing it for future use when the sun itself is obscured.

Many of the technologies to address these issues are already in hand. But it all comes down to cost.

Commercial solar cells, most often made from silicon, typically convert sunlight into electricity with an efficiency of less than 20 per cent, although some test cells do a little better. Given their manufacturing costs, modules of today's cells incorporated in the power grid would produce electricity at a cost roughly six times higher than current prices.
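A rough back-of-envelope sketch shows how efficiency and manufacturing cost feed through to a price per kilowatt-hour. Every input below is an illustrative assumption rather than a figure from the article, and the calculation ignores inverters, installation and financing.

```python
# Back-of-envelope, module-only cost of solar electricity.
# All inputs are illustrative assumptions, chosen only to show the arithmetic.
module_cost_per_m2 = 300.0            # assumed manufacturing cost, $ per square metre of panel
efficiency = 0.18                     # assumed conversion efficiency (the article cites <20%)
insolation_kwh_per_m2_year = 1500.0   # assumed annual sunlight per square metre (site-dependent)
lifetime_years = 25.0                 # assumed module lifetime

annual_output = efficiency * insolation_kwh_per_m2_year        # kWh per m2 per year
cost_per_kwh = module_cost_per_m2 / (annual_output * lifetime_years)
print(f"Annual output: {annual_output:.0f} kWh per m2")
print(f"Module-only cost: ${cost_per_kwh:.3f} per kWh")
```

Raising efficiency increases the lifetime output directly, while cheaper manufacturing shrinks the numerator; both push the cost per kilowatt-hour down, which is exactly the engineering lever described above.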

To make solar economically competitive, engineers must find ways to improve the efficiency of the cells and to lower their manufacturing costs.

Further reading

http://eandt.theiet.org/magazine/2012/08/photo-essay-solar-power.cfm

http://eandt.theiet.org/magazine/2011/03/supersize-solar.cfm

http://eandt.theiet.org/magazine/2011/07/crazy-glazing.cfm

Provide energy from fusion

Nuclear fusion promises a way out of our impending energy crisis, but its practical application still lies a long way off.

One major barrier preventing the achievement of worthwhile levels of fusion is the durability of the structural materials used to house a fusion reactor.

While energy-rich neutrons carry most of the energy extracted from a fusion reaction, they also convert atoms in the chamber wall and surrounding blanket into radioactive material. This further weakens materials that are already straining to withstand plasma temperatures of up to 100 million degrees, and it complicates both the confinement of radioactivity and the disposal of nuclear waste.

The second engineering challenge is achieving fusion economically for a sustained period of time. Fusion was first achieved for a significant period by the Culham Centre for Fusion Energy at its JET (Joint European Torus) facility in 1997, producing 16MW of power, a record that has yet to be broken more than 15 years on.

JET is currently the largest tokamak in existence, but it is soon to be eclipsed by its successor, ITER (the International Thermonuclear Experimental Reactor), a collaboration between the United States, the European Union, Japan, Russia, China, South Korea and India.

ITER is intended to become the first tokamak to deliver a sustained, energy-producing pulse, generating up to 500MW of power.
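The usual yardstick for comparing machines such as JET and ITER is the fusion gain factor Q, the ratio of fusion power produced to external heating power supplied. The figures below are the values commonly quoted for each machine (JET's 16MW record pulse needed roughly 24MW of heating; ITER's 500MW target assumes about 50MW of heating) and are used here as an illustration rather than official specifications.

```python
# Fusion gain factor Q = fusion power out / external heating power in.
# The power figures are commonly quoted values, used here for illustration.
def fusion_gain(p_fusion_mw, p_heating_mw):
    """Return the gain factor Q for a fusion pulse."""
    return p_fusion_mw / p_heating_mw

jet_q = fusion_gain(16.0, 24.0)    # JET's 1997 record: ~16 MW out for ~24 MW of heating
iter_q = fusion_gain(500.0, 50.0)  # ITER's design goal: 500 MW out for ~50 MW of heating

print(f"JET  Q ~ {jet_q:.2f} (less power out than heating put in)")
print(f"ITER Q ~ {iter_q:.0f}   (ten times the heating power)")
```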

The main areas of research aim to address the instability of fusion plasmas, and include the launch of the International Fusion Materials Irradiation Facility to investigate potential new materials for use in fusion plants.

Inroads are already being made into improving the magnetic fields used to confine the fusion fuel as it combines and reacts. To unlock this practically unlimited supply of energy, considerable advances will be needed in superconducting magnets, advanced vacuum systems and structural materials, as well as in robust robotic systems for the repair and maintenance of the reactors.

Further reading

http://eandt.theiet.org/magazine/2011/05/steve-cowley.cfm

http://eandt.theiet.org/magazine/2013/01/how-to-energy.cfm

http://eandt.theiet.org/news/2011/aug/rossi-reactor.cfm

http://eandt.theiet.org/contribute/energy/future-fusion.cfm

Develop carbon sequestration methods

Increased carbon dioxide released into our atmosphere has a lot to answer for. Rising sea levels, increased storms and failed crops are just some of the side effects produced by its release, and it is predicted that one trillion tonnes of the stuff will need to be buried by various means of carbon capture and storage (CCS) before 2100.

Various commercial methods of capturing carbon dioxide are already in mainstream use, including in the manufacture of dry ice and carbonated beverages. This process could be adapted for carbon capture at coal-burning plants by replacing smokestacks with two absorption towers: the first would strip CO2 from the flue gases using absorption chemicals, while the second would separate the carbon dioxide from those chemicals so they could be reused in the process.

To make this process more energy efficient, coal could also be burned in pure oxygen as opposed to the usual mix of normal air, eradicating the need to separate the carbon dioxide from the nitrogen.
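The scale of the capture task follows from simple combustion chemistry: burning carbon turns each 12 tonnes of carbon into 44 tonnes of CO2. A minimal sketch, in which the coal's carbon content is an assumed, illustrative figure:

```python
# Rough CO2 output per tonne of coal burned (C + O2 -> CO2).
# Molar masses: carbon 12, CO2 44, so 1 t of carbon yields ~3.67 t of CO2.
CO2_PER_TONNE_CARBON = 44.0 / 12.0

carbon_fraction = 0.70        # assumed carbon content of the coal; varies with coal type
coal_burned_tonnes = 1.0

co2_tonnes = coal_burned_tonnes * carbon_fraction * CO2_PER_TONNE_CARBON
print(f"~{co2_tonnes:.1f} tonnes of CO2 to capture per tonne of coal burned")
```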

The second step in CCS is the storage of carbon dioxide. Several potential environments have been identified by scientists and engineers as appropriate storage grounds for carbon dioxide, but none are currently foolproof sites.

Depleted oil and gas fields are an attractive prospect for storage, as the injected carbon dioxide can be used to recover remaining oil trapped deep in the rock. Sedimentary brine formations around 800m below ground are also a viable option, as the high pressure at depth keeps the carbon dioxide dense.
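The 800m figure is not arbitrary: at roughly that depth the pressure and temperature both exceed carbon dioxide's critical point (about 7.4MPa and 31°C), so injected CO2 behaves as a dense, almost liquid-like fluid that occupies far less pore space. A rough check, assuming a hydrostatic column of water and a typical geothermal gradient (both assumptions, since real sites vary):

```python
# Why ~800 m down? Check that pressure and temperature there exceed CO2's
# critical point (~7.38 MPa, ~31 C), so the CO2 stays dense.
# Water density, surface temperature and geothermal gradient are assumptions.
depth_m = 800.0
rho_water = 1000.0                   # kg/m3, assumed hydrostatic column of water/brine
g = 9.81                             # m/s2
surface_temp_c = 15.0                # assumed mean surface temperature
geothermal_gradient_c_per_m = 0.030  # assumed ~30 C per km of depth

pressure_mpa = rho_water * g * depth_m / 1e6
temperature_c = surface_temp_c + geothermal_gradient_c_per_m * depth_m
print(f"Pressure at {depth_m:.0f} m:    ~{pressure_mpa:.1f} MPa (critical pressure ~7.4 MPa)")
print(f"Temperature at {depth_m:.0f} m: ~{temperature_c:.0f} C    (critical temperature ~31 C)")
```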

Unfortunately, both of these locations are prone to faults within the rock that could provide the carbon dioxide with the opportunity to leak out into the atmosphere, so engineers must design robust new systems to prevent this escape.

A third, more costly yet more reliable, option is to inject the carbon dioxide beneath the ocean floor. Although this process is more complex and therefore more expensive, it has the advantage that leakage is not an issue.

Further reading

http://eandt.theiet.org/explore/students/2011/carbon-capture-careers.cfm

http://eandt.theiet.org/magazine/2011/07/carbon-capture.cfm

http://eandt.theiet.org/magazine/2011/08/deep-down.cfm

http://eandt.theiet.org/magazine/2009/10/carbon-consultant.cfm

Manage the nitrogen cycle

The nitrogen cycle is a natural process involving the nitrogen that makes up four-fifths of the Earth's atmosphere, and it is integral to a living organism's healthy production of proteins and DNA. However, the nitrogen produced by humans, the quantity of which has doubled since the industrial revolution, is fixed nitrogen, which is extremely difficult to break down into a useful resource outside of plant roots and lightning storms.

Innovative agricultural supply chains will need to be forged to maintain a sustainable food system. Doing so would reduce the overall effect of the nitrogen cycle on the environment, either by cutting the rate at which fixed nitrogen is produced or by improving the rate at which it is broken down into organic nitrogen. The engineering challenge is to come up with a viable way to produce organic nitrogen. Finding new ways to control nitrous oxide release into the air while fossil fuels are being burnt will also be a challenge for future engineers.

High-yielding crops, while producing more food from a single plant, rely heavily on rich fertiliser, which contributes significantly to the nitrogen cycle. If fertiliser were used more efficiently, for example by reducing run-off and erosion during plant growth, more than the current figure of only around half of the fixed nitrogen applied would end up in harvested plants.

The next challenge is to create a viable way to prevent the leakage of fixed nitrogen in farming, which can occur anywhere in the supply chain, from the field and animal feeding through to the sewage plant treating the waste. Improved waste recycling also holds the key to a marked reduction in the amount of fixed nitrogen released into the air.

Manure from cows and other livestock represents an ideal nutrient-rich fertiliser, but engineers must come up with a sustainable method of producing manure pellets to overcome logistical obstacles. Livestock also pose an environmental problem in that they produce high levels of methane, another damaging greenhouse gas. New ways of reducing the release of such gases from this waste, and of putting them to use as a resource, must therefore also be considered.

Further reading

http://eandt.theiet.org/magazine/2012/04/playing-god.cfm

http://eandt.theiet.org/magazine/2012/11/fighting-food-poverty.cfm

http://eandt.theiet.org/magazine/2011/10/high-rise-hopes.cfm

Provide access to clean water

Access to clean water would seem to be one of the fundamental needs for modern life, but the water that many of us take for granted is still not available to everyone around the world. Many women and children, particularly in rural areas in developing countries, spend hours each day walking miles to collect water from unprotected sources such as open wells, muddy dugouts or streams.

In urban areas they collect it from the polluted waterways that surround the towns, or pay high prices to buy it from vendors who obtain it from dubious sources. The water is often dirty and unsafe, but they have no alternative.

Diarrhoeal diseases caused by unsafe water and poor sanitation, such as cholera, typhoid and dysentery, are common across the developing world – killing 4,000 children daily. People suffering from these diseases or caring for children who are suffering from them are often unable to work to earn money, yet face large medical bills.

Total global investments in water and sanitation would need to double for the Millennium Development Goal targets of halving the proportion of people living without water and sanitation by 2015 to be met.

But water for drinking and personal use is only a small part of society's total water needs – household consumption usually accounts for less than 5 per cent of total water use. In addition to sanitation, a large proportion of the water we use is for agriculture and industry.

Technologies are being developed to improve recycling of wastewater and sewage treatment, for instance, so that water can be used for irrigation or industrial purposes.

A different technological approach to the water problem involves developing strategies for reducing water use. Agricultural irrigation consumes enormous quantities of water; in developing countries, irrigation often exceeds 80 per cent of total water use. Improved technologies to more efficiently provide crops with water, such as drip irrigation, can substantially reduce agricultural water demand. Water loss in urban supply systems is also a significant problem.
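The leverage of irrigation efficiency follows directly from that 80 per cent share: when irrigation dominates total use, even a modest saving in the field produces a large cut in overall demand. In the sketch below the 80 per cent share is the figure quoted above, while the 30 per cent saving from drip irrigation is purely an illustrative assumption.

```python
# How a saving in irrigation water propagates to total water demand.
irrigation_share = 0.80   # irrigation's share of total water use (figure quoted above)
drip_saving = 0.30        # assumed fractional saving from drip irrigation (illustrative)

reduction_in_total_demand = irrigation_share * drip_saving
print(f"Total water demand falls by ~{reduction_in_total_demand:.0%}")  # ~24% in this example
```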

Further reading

http://eandt.theiet.org/magazine/2012/11/the-drop-in-demand.cfm

http://eandt.theiet.org/magazine/2012/08/photo-essay-solar-power.cfm

http://eandt.theiet.org/magazine/2012/10/water-crisis-solved.cfm

http://eandt.theiet.org/magazine/2010/11/who-owns-water.cfm

Advance health informatics

Concerned with the acquisition, management, analysis and use of medical information, health informatics is a far-reaching field taking in everything from personal medical records to data concerning diseases.

The cost of disk space has been falling for 30 years, and the resulting flood of data has left many industries grappling with information overload and given rise to 'big data'. Health informatics is no different. As the amount of data grows, software must give clinicians access to information relevant to each patient, to archival medical research material and to decision support, while remaining mindful of pitfalls such as breaches of patient confidentiality and the misuse of data by medical insurers or employers.

Another problem lies in bringing the old, largely paper-based system of record keeping up to date with new computerised systems, a task easier said than done given that many of the programs used to store data are incompatible, sometimes even within the same hospital. Future systems must be engineered to allow data to be shared across all the different systems in use in the various departments, creating a fully integrated regional, national and global health informatics network.

Methods of data collection are, similarly, in a state of great change. Soon wearable devices that monitor pulse, temperature or other important measurements could be embedded within clothing or even the body. These sensors could contain transmitters and receivers to monitor a patient's state, whether in a hospital or at home, and alert medical staff if complications or problems arise, or even tell them to administer drugs.
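At its simplest, such a monitoring system is a loop that compares each reading against preset limits and raises an alert when a value falls outside them. The sketch below is a deliberately minimal illustration: the thresholds and readings are invented, and a real system would use clinically validated limits, trend analysis and secure transmission to staff.

```python
# Minimal sketch of a wearable-sensor alerting loop (invented thresholds/readings).
NORMAL_RANGES = {
    "pulse_bpm": (50, 110),
    "temperature_c": (35.5, 38.0),
}

def check_reading(metric, value):
    """Return an alert message if the reading is outside its normal range, else None."""
    low, high = NORMAL_RANGES[metric]
    if not low <= value <= high:
        return f"ALERT: {metric} = {value} (normal range {low}-{high})"
    return None

# Simulated stream of readings from a patient's sensors.
readings = [("pulse_bpm", 72), ("temperature_c", 38.6), ("pulse_bpm", 118)]
for metric, value in readings:
    alert = check_reading(metric, value)
    if alert:
        print(alert)   # in practice this would notify medical staff, not just print
```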

On a larger scale, the power of health informatics could be harnessed to combat the outbreak and spread of disease. A viral threat such as Avian flu H5N1 could spark a global pandemic. Early warning systems that monitor data on hospital visits and orders for drugs or lab tests are already in place in some countries but more sophisticated methods are required. New strategies for producing vaccines in large quantities must also be devised, perhaps using faster cell culture methods.

Further reading

http://eandt.theiet.org/magazine/2010/11/patents-on-genetics.cfm

http://eandt.theiet.org/magazine/2008/08/analysis.cfm

http://eandt.theiet.org/magazine/2012/11/e-health-keep-taking-the-tablets.cfm

http://eandt.theiet.org/magazine/2009/10/magic-of-mobile-phones.cfm

Engineer better medicines

The sequencing of the human genome in 2003 represented a true paradigm shift in biology and medicine, the effect of which is likely to be felt for many years.

Human DNA contains more than 20,000 genes, which are largely the same in all humans. A small number, less than 1 per cent, differ from one individual to the next giving us our own identity, personality and appearance. These differences also give rise to unique elements of brain and body chemistry and can predispose people to certain illnesses or alter the way in which they respond to medications. Knowledge of a person's unique genetic make-up, therefore, can be used to tailor drugs to meet an individual's unique needs potentially leading to a new era of personalised medicine.

Standing in the way of this, however, are several challenges: collecting and managing the huge amounts of data required; developing better systems to assess a patient's genetic profile; and creating inexpensive diagnostic devices that can detect minute amounts of chemicals in the blood.

Currently, medication is often prescribed incorrectly, encouraging the development of drug resistance without any associated benefit. Faster, more effective diagnostic methods would allow larger numbers of drugs to be screened promptly, so that the appropriate treatment could be applied more efficiently and the problem reduced.

Traditionally, antibiotics are chosen that attack a wide range of bacteria, since clinicians cannot always be sure which bacteria are actually causing a given problem. Analytical methods that pinpoint the exact nature of an infection could facilitate the use of more narrowly targeted drugs, reducing the risk of the bacteria developing resistance. Certain viruses, too, may be combated by engineering small molecules that attack their RNA and prevent them from reproducing.

Elsewhere, advances in the field of synthetic biology mean it may soon be possible to regenerate tissue or organs or even grow replacements from scratch. Researchers in nanotechnology may soon be able to design systems that are hosted by the body and release, say, insulin when the blood glucose levels are high.

Further reading

http://eandt.theiet.org/magazine/2010/17/medical-technology.cfm

http://eandt.theiet.org/magazine/2008/07/medicine-strides.cfm

http://eandt.theiet.org/magazine/2008/13/lifes-secrets-0813.cfm

http://eandt.theiet.org/magazine/2008/11/cloning-cures.cfm

http://eandt.theiet.org/magazine/2008/13/bioengineering0813.cfm

Restore and improve urban infrastructure

Infrastructure is the lifeblood of any city, delivering water and power, removing waste and allowing efficient movements of its inhabitants and goods. But many of our established cities are serviced by ageing infrastructure, much of which can trace its roots back to the Victorian age. Keeping all these assets in excellent working order is not a new test, but it is a growing challenge.

Vast amounts of the existing infrastructure are buried, posing several problems for maintaining and upgrading it. One major challenge will be to devise methods for mapping and labelling buried infrastructure, both to assist in improving it and to help avoid damaging it.

A project of this sort is now underway in the UK, with the aim of developing ways to locate buried pipes using electromagnetic signals from above the ground. The idea is to find metallic structures capable of reflecting electromagnetic waves through soil, much as a reflector makes a bicycle easier to see at night.
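One way a reflected signal becomes a depth estimate is the time-of-flight calculation familiar from ground-penetrating radar: the wave's round-trip travel time, together with its speed in soil, gives the depth of the reflector. The sketch below illustrates that general principle only; it is not a description of the UK project's specific technique, and the soil permittivity is an assumed value.

```python
# Depth of a buried reflector from the two-way travel time of an EM pulse,
# as in ground-penetrating radar. Generic illustration with an assumed soil.
C = 3.0e8   # speed of light in a vacuum, m/s

def reflector_depth_m(two_way_time_ns, relative_permittivity):
    """Estimate reflector depth (metres) from round-trip travel time (nanoseconds)."""
    v = C / relative_permittivity ** 0.5    # wave speed in the soil
    return v * (two_way_time_ns * 1e-9) / 2.0

# Example: a 24 ns echo in soil with relative permittivity ~9 (moist soil, assumed).
print(f"Estimated depth: {reflector_depth_m(24.0, 9.0):.2f} m")   # ~1.2 m
```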

Other major infrastructure issues involve transportation. Streets and highways will remain critical transportation conduits, so their maintenance and improvement will also remain an important challenge. But the greater challenge will be engineering integrated transportation systems, making individual vehicle travel, mass transit, bicycling, and walking all as easy and efficient as possible.

While such services can help support growing urban populations, they must be accompanied by affordable and pleasant places for people to live. Engineers must be engaged in the architectural issues involved in providing environmentally friendly, energy-efficient buildings both for housing and for business.

But in this constrained financial age, funding major projects is not easy. Numerous policies and political barriers must be overcome. And so, a major grand challenge for infrastructure engineering will be not only to devise new approaches and methods, but to communicate their value and worthiness to society at large.

Further reading

http://eandt.theiet.org/magazine/2009/07/two-cities.cfm

http://eandt.theiet.org/magazine/2012/06/smart-transport.cfm

http://eandt.theiet.org/magazine/2012/06/smart-cities.cfm

http://eandt.theiet.org/magazine/2009/21/model-for-cities.cfm

Reverse engineer the brain

On 11 May 1997 IBM's purpose-built chess computer Deep Blue defeated its human opponent, world champion Garry Kasparov, two games to one with three draws. It was a huge, symbolic victory for artificial intelligence, and a future populated with robotic waiters and Terminator-style supersoldiers seemed only a few years away.

The reality, of course, proved different. But some researchers are now taking on the challenge of creating artificial intelligence from a different angle: by starting with study of the human brain and working backwards.

The ever-increasing processor power driven by Moore's Law could see computer simulations of the brain, among other things, becoming more detailed and more accurate. By studying how the brain itself learns, researchers may be able to design computer processors that handle multiple streams of information at the same time rather than the largely one-at-a-time approach currently employed. But progress is hindered by the brain's innate complexity: each nerve cell in the brain receives impulses from tens of thousands of others, so tracing the path of any given signal is extremely difficult.

However, there are already examples of artificial intelligence benefitting from reverse engineering of the brain. Researchers are studying patients with a damaged hippocampus, the area of the brain responsible for memory and learning, to understand the electrical signalling between nerve cells that is required for forming and recalling memories. Engineers have begun designing chips that mimic the brain's communication system.
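Such chips typically implement simplified spiking-neuron models rather than full biological detail. A common textbook example is the leaky integrate-and-fire neuron, sketched below with illustrative parameters; it is offered only as a flavour of the approach, not as a description of any particular chip.

```python
# Leaky integrate-and-fire neuron: the membrane voltage leaks towards rest,
# integrates the input current, and emits a spike when it crosses a threshold.
# Parameters are illustrative textbook-style values (millivolts, milliseconds).
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, resistance=10.0):
    """Return the time steps at which the neuron fires."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Decay towards the resting voltage while being driven by the input.
        v += dt / tau * (-(v - v_rest) + resistance * i_in)
        if v >= v_thresh:
            spikes.append(step)
            v = v_reset        # fire, then reset the membrane voltage
    return spikes

# A constant input produces a regular train of spikes.
print(simulate_lif([2.0] * 100))
```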

Researchers at Duke University Medical Centre in the US have taught rhesus monkeys to control a robotic arm using only signals from their brains and visual feedback from a screen. In the future the same technology may be applied to improve neuroprosthetic limbs for people who have been paralysed.

Researchers say that the technology they have developed could also improve rehabilitation of those with brain and spinal cord damage due to strokes, diseases or trauma.

Further reading

http://eandt.theiet.org/magazine/2008/05/the-mind.cfm

http://eandt.theiet.org/magazine/2012/06/smart-humans.cfm

http://eandt.theiet.org/magazine/2009/02/i-think-therefore.cfm

http://eandt.theiet.org/magazine/2009/09/smart-robots.cfm

Secure cyberspace

The challenge of securing cyberspace is shifting from providing methodologies to safeguard key organisational and personal assets (valuable and sensitive data, say) against cyber criminals, hackers, hacktivists, enemy agents and other online threats, towards establishing a baseline discipline governing the specification and design of all the computerised systems that must henceforth be protected.

In an increasingly connected world in which ever more physical devices are Web-ready, and therefore targets for cyber threats, securing cyberspace often means making it safer for devices that rarely interact with humans but can have a direct impact on their wellbeing and safety.

Security expert Corey Nachreiner of WatchGuard suggests there is now a heightened likelihood that the next 12 months will see the first cyber-attack to result in a human death. The accelerating proliferation of both networked devices and online threats is creating a 'perfect storm' of vulnerable connected systems that, if targeted, could lead to a 'fatal malfunction'.

Networked road vehicles, Internet-ready medical devices and intelligent buildings are among the emerging connected physical domains that may start to be hit by the end of 2013. Such systems often form part of a nation's critical national infrastructure, and protection at this scale underlines the need to foster a new generation of cyber-security professionals equal to the task.

A further challenge lies in the advancement of defensive measures that proactively counter malicious cyber activity, so that human (and financial) resources can be deployed where they are most needed. Seculert, for example, develops threat detection tools that use Big Data analytics to inform enterprise cyber security strategy.
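The core idea behind such analytics can be shown with a very small example: build a statistical baseline of normal behaviour for each host, then flag large deviations from it for investigation. The sketch below is a generic z-score check over invented connection counts; it is not a description of Seculert's or any other vendor's actual system.

```python
# Generic anomaly flag on a host's outbound-connection counts.
# Data and the 3-sigma threshold are invented for illustration.
from statistics import mean, pstdev

baseline = [120, 135, 128, 110, 142, 125, 131]   # recent daily connection counts
todays_count = 410

mu, sigma = mean(baseline), pstdev(baseline)
z_score = (todays_count - mu) / sigma
if z_score > 3:
    print(f"Flag for investigation: z-score {z_score:.1f} against a baseline mean of {mu:.0f}")
```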

On a broader scale, another cyberspace security challenge is the development of techniques by which collective intelligence about hackers, virus and malware propagators, and other 'threatscape' actors can be used predictively to forewarn prospective victims of an impending strike.

Further reading

http://eandt.theiet.org/news/2012/oct/hague-cybercrime.cfm

http://eandt.theiet.org/news/2012/nov/netwars-cybercity.cfm

http://eandt.theiet.org/explore/reports/it-security/index.cfm

http://eandt.theiet.org/news/2012/dec/cyber-reserve.cfm

http://eandt.theiet.org/news/2012/oct/it-police.cfm

http://eandt.theiet.org/news/2012/oct/eu-cyber.cfm

Prevent nuclear terror

In the 67 years since Little Boy was dropped on Hiroshima, the components needed to construct an atomic weapon have been accumulating across the globe. In that time eight countries (the US, the Russian Federation, the UK, France, China, India, Pakistan and North Korea) have successfully detonated nuclear weapons. But with many countries using nuclear material as a source of power, and some warheads potentially not secure from theft or sale, the threat of nuclear attack by rogue nations or highly organised terrorist groups is ever present.

The challenges this state of affairs presents for engineers are manifold. Most simply, an information system is needed to keep track of all nuclear weapons and material. Secondly, the global community has to be able to ensure that a nation using nuclear material in power plants is not extracting plutonium for use in weapons in contravention of international law.

One possible solution is the development of a passive monitoring device sited near a reactor that would transmit real-time data about its contents. A further detection problem is the ease with which freight can be shipped around the world. Some ten million shipping containers enter the US each year, each capable of holding as much as 30 tonnes of cargo. Finding a few kilograms of weapons-grade material among all the garden furniture, children's toys and other miscellanea making their way from A to B is like finding the proverbial needle in a haystack.

One solution, which has been nicknamed the nuclear car wash, involves scanning the containers as they move along a conveyor belt. As they pass through the scanner, they receive pulses of neutrons, subatomic particles used to induce nuclear reactions. The neutrons would induce fission in any weapons-grade nuclear material within the container, producing radioactive substances whose gamma-ray emissions the scanner could then detect.
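Deciding whether a burst of gamma rays is a genuine fission signature or merely background radiation is, at root, a counting-statistics problem. The sketch below shows a standard significance check with invented count numbers; it illustrates the statistics only, not the actual detector design.

```python
# Is a gamma-ray count significantly above background? Invented numbers.
from math import sqrt

expected_background = 40.0   # assumed background counts per scan window
measured_counts = 95         # assumed counts recorded for one container

excess = measured_counts - expected_background
significance = excess / sqrt(expected_background)   # sigmas above background (Poisson approx.)
print(f"Excess of {excess:.0f} counts, ~{significance:.1f} sigma above background")
if significance > 5:
    print("Divert the container for secondary inspection")
```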

Other problems that need addressing in the coming years include how to render potential devices harmless and how to clean up should a nuclear attack take place.

Further reading

http://eandt.theiet.org/news/running/nuclear-policy.cfm

http://eandt.theiet.org/magazine/2011/04/nuclear-perception.cfm

http://eandt.theiet.org/magazine/2012/11/in-the-shadow-of-the-cold-war.cfm

http://eandt.theiet.org/magazine/2012/10/cuban-missile-crisis.cfm

http://eandt.theiet.org/magazine/2011/08/one2ten-unusual-weapons.cfm

http://eandt.theiet.org/news/2012/aug/nkorea-nuclear.cfm

Advance personal learning

Rigid exam systems can limit the scope for personalised learning, but courses can still rise to the challenge of exploiting digital technology to adapt to the individual needs of students, so that each can progress in their own way.

Adaptive learning has emerged to cater for this, combining Web technologies with interactive methods to tailor courses to the student. It goes beyond basic adaptive testing, where each question gets harder until the student fails, and beyond simple online teaching that presents material in response to the pupil's most recent on-screen activity.

A number of companies, such as New York-based Knewton, have developed platforms that respond in real time to the activity of each user on the system, adjusting to provide the most relevant content according to a more complex analysis of test scores, speed, accuracy, delays, keystrokes, click-streams and drop-offs.

Such platforms go further than adaptive testing, which assesses the student's current state of knowledge, by then judging what material or activities would best help that student progress. There are various algorithms for adaptive learning, but they all analyse a student's performance across multiple data points, drilling down into different concepts to recommend follow-up actions, in other words taking on the role of a personal human tutor.
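A stripped-down version of that loop is easy to sketch: keep a running estimate of the student's ability, pick the next item whose difficulty is closest to that estimate, and nudge the estimate after each answer. The Elo-style update below is a generic illustration with invented items, not the algorithm of any particular platform.

```python
# Minimal adaptive-learning loop: estimate ability, choose the best-matched
# question, update the estimate after each answer. Generic illustration only.
from math import exp

def p_correct(ability, difficulty):
    """Probability of a correct answer under a simple logistic model."""
    return 1.0 / (1.0 + exp(difficulty - ability))

def update_ability(ability, difficulty, correct, k=0.4):
    """Nudge the ability estimate by how surprising the outcome was."""
    return ability + k * ((1.0 if correct else 0.0) - p_correct(ability, difficulty))

def next_item(ability, item_difficulties):
    """Pick the remaining item whose difficulty is closest to the ability estimate."""
    return min(item_difficulties, key=lambda item: abs(item_difficulties[item] - ability))

items = {"q1": -1.0, "q2": 0.0, "q3": 1.0, "q4": 2.0}   # invented item difficulties
ability = 0.0
for answered_correctly in [True, True, False]:           # simulated responses
    item = next_item(ability, items)
    ability = update_ability(ability, items.pop(item), answered_correctly)
    print(f"answered {item}: ability estimate now {ability:+.2f}")
```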

A related development is the digital textbook, an attempt to get away from the rigid book form, which is often out of date by the time it is published. One of the most advanced national digital textbook programmes began in South Korea in 2007; it is now being tested in primary schools, with free nationwide distribution planned to start this year. Digital textbooks will become essential materials for adaptive learning: they can incorporate a core curriculum component while adapting to individual students as they interact with the material online, so that each person feels the book has been written just for them. The digital textbooks will combine text, reference material and dictionaries with multimedia content such as video clips, animations and virtual reality, linked to individual student workbooks.

Further reading

http://eandt.theiet.org/magazine/2008/09/learning-curves.cfm

http://eandt.theiet.org/news/2013/jan/google-raspberry-pi.cfm

http://eandt.theiet.org/explore/students/2012/learning-from-others.cfm

http://eandt.theiet.org/magazine/2011/06/leadership-development.cfm

Enhance virtual reality

Virtual reality (VR) has obvious applications in training military personnel, surgeons and airline pilots, safety-critical fields where it is too dangerous to let novices loose on the real thing, and this is where VR technology largely established itself.

As 3D simulation capability has advanced and become allied with sensory and mechanical feedback techniques, VR has expanded into other fields such as medical treatment and industrial design. There has been spectacular progress in addressing the challenge of treating phobias and stress disorders, as in a US trial conducted in 2011 on sufferers of post-traumatic stress disorder (PTSD) caused by incidents experienced while serving in the Afghanistan conflict. VR simulation of those incidents was combined with physiological monitoring and training in methods to come to terms with the disorder. About 70 per cent of participants who received the VR therapy showed a clinically significant improvement within 10 weeks, compared with 12.5 per cent of those given conventional treatment involving psychotherapy alone. The VR method was a form of 'stress inoculation', combining training with controlled exposure in an environment the patient knows to be safe.

VR is being applied to industrial design at the UK's Virtual Design Enterprise Centre at Wolverhampton University. In this context VR is an evolution of computer-aided design, whose aim was to visualise a product and so avoid obvious mistakes before incurring the higher costs and time lags of creating a model or prototype.

VR extends this to simulation of product characteristics such as aerodynamics or friction, as well as giving a greater sense of the final product to enable aesthetic or design aspects to be enhanced.

Another emerging application for VR is surgery, both for training and for presenting patients with their options in lifelike detail. Traditionally, surgeons have trained on cadavers, dummies or animals, but these are not ideal for advanced endoscopic procedures, where VR can simulate the movements much more accurately. The UK's Golden Jubilee National Hospital near Glasgow recently set up a virtual 3D surgical training programme along these lines for medical students.

Further reading

http://eandt.theiet.org/magazine/2008/14/virtual-reality.cfm

http://eandt.theiet.org/explore/students/2012/dr-romano.cfm

http://eandt.theiet.org/magazine/2009/04/virtually-there.cfm

http://eandt.theiet.org/magazine/2008/12/our-virtual-future-0812.cfm

Engineering the tools of scientific discovery

The astronomical telescope, invented in the 17th century, was one of the great early examples of precision engineering leading directly to great scientific discoveries, in this case through high resolution optics.

The development of the Hubble Space Telescope took this co-operation between the two disciplines to a new level, as there was little scope for repairing or upgrading the equipment in space. But the telescope did develop a fault that would have compromised the rest of its operational life, and in 2002 Dr Edward Cheung, then principal engineer for the Hubble Space Telescope Development Project, oversaw a technically difficult service mission to install a repair component that he named Aruba, after the Caribbean island where he grew up.

Dr Cheung was recently awarded Knight of the Royal Order of the Netherlands Lion in recognition of his engineering achievements on Hubble, while on the scientific side astronomer Adam Riess shared the 2011 Nobel Prize in Physics for the discovery of the accelerating expansion of the Universe through observations of distant supernovae (exploding stars), work that drew on the Hubble telescope.

Data from the Hubble project continues to raise new challenges. On the biomedical front, there are few better examples of the symbiotic relationship between engineering and science than the Institute of Biomedical Engineering (IBME) at Imperial College London. This was set up with a £10m donation in 2004 to stimulate advances in medical diagnosis and treatment by bringing together engineers, life-science researchers and medical practitioners. It has made many original contributions to biomedical research, recognised by winning the 2009 Times Higher Education award for outstanding contribution to innovation and technology.

Since then the institute has continued to focus on three domains – early detection, diagnosis and real time monitoring of health conditions – providing a framework for the emerging era of personalised medicine. This will depend on accurate and often continuous monitoring of health conditions to match therapies to the specific genetic profile and current metabolic condition of the patient. The institute has developed a number of novel methods for such continuous sensing. 

Further reading

http://eandt.theiet.org/magazine/2012/09/small-stuff-big-science.cfm

http://eandt.theiet.org/magazine/2012/04/storm-chasers.cfm

http://eandt.theiet.org/magazine/2011/10/interview-heinz-wolff.cfm

http://eandt.theiet.org/news/2012/sep/engineering-research.cfm

http://eandt.theiet.org/news/2012/jul/scientific-research.cfm
