The next generation of electronic materials will use nanoscale structures to improve their properties, and that calls for a new understanding of quantum effects.
'What's the difference between science and engineering? If a scientist gets a result once, there's a paper in it; if an engineer gets a result once, they haven't done enough testing.'
There is an element of truth in this joke and it is especially resonant for nanotechnology and its practical application to electronics. Science has generated countless headlines about nano-materials with little follow-up as to how these might fare in the real world. But that's beginning to change.
Since the 1920s, we have known what materials get up to at the atomic level thanks to Erwin Schrödinger, whose famous equation describes the behaviour of electrons in atoms with a wave function that exists over all space.
'Whether a material is a metal or a superconductor or a magnet or ultra-hard or jelly depends on how more or less successfully those electrons cosy up to the ionic cores or how they are shared in bonds between the atoms,' says Nicholas Harrison, head of the computational materials science group at Imperial College London, and at STFC Rutherford Appleton and Daresbury Laboratories.
Although a material's broad behaviour is superficially easy to estimate, the tricky part is working out the detailed interactions between electrons and nuclei. Modelling materials at the quantum mechanical level is computationally voracious. However, thanks to a combination of faster parallel computing engines and the cleverness of simulation algorithm designers, the behaviour of a thousand or so atoms can now be tackled in a detailed way.
The aim of this work is software that can do a quantum mechanical simulation on any material. 'We can input the details of the different atoms of interest, push the 'go' button and the software will pour in the appropriate number of electrons and figure out how they would like to organise themselves in terms of electron density, the position of various orbitals and so on,' explains Harrison. From this, the software works out what the forces on the atoms would be. 'Because you're using the fundamental rules of the game, you can create a material and push 'go' with every confidence that you are accurately calculating its properties, structure and vibrations,' he adds.
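Harrison's "push go" workflow hides an iterative core: the electrons' preferred arrangement depends on the potential they feel, which in turn depends on their arrangement, so the software must cycle until the two agree. A toy caricature of that self-consistent loop (the physics is replaced by a made-up stand-in function; this reflects no real package's API):

```python
# Toy self-consistent loop: guess a density, solve for a new one in the
# potential that density generates, mix, and repeat until nothing changes.

def solve_in_potential(density):
    """Stand-in for 'solve the quantum problem in this potential'."""
    # A made-up contraction map, chosen so the iteration provably converges.
    return 0.5 * density + 1.0

def scf(initial_guess=0.0, mixing=0.3, tol=1e-10, max_iter=1000):
    density = initial_guess
    for step in range(max_iter):
        new_density = solve_in_potential(density)
        if abs(new_density - density) < tol:
            return density, step
        # Linear mixing damps oscillations, much as real codes do.
        density = (1 - mixing) * density + mixing * new_density
    raise RuntimeError("did not converge")

density, steps = scf()
print(density)  # within tol of the fixed point, density = 2.0
```

Real electronic-structure codes do something structurally similar, only with wave functions and densities over all space instead of a single number.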
In practical engineering terms, the researchers have developed the basic techniques that will be necessary for understanding in advance the mechanisms of how crystal layers grow. For example, titanium oxide will adopt different structures with a range of optical properties depending on the proportions of atoms used and the energy at which they arrive at the glass surface, something that can now be controlled in a repeatable way.
What's become evident from the simulations is that materials' preferred natural bond lengths have a significant influence on how multilayer coatings behave, something that chip designers have found with strained silicon, which uses multiple layers of different mixtures of semiconductor materials to artificially stress the crystal lattice.
'If you put zinc oxide above titanium oxide, one of them is going to be miserable because the preferred titanium-oxygen bond length is different from the zinc-oxygen bond length so they don't line up perfectly,' explains Harrison. 'What tends to happen is that one of the layers starts to develop dislocations or cracks, and it's a bit of a war about who develops cracks and who stays smooth. In turn, the cracks affect how light interacts with the coatings.'
Clearly, there is a level of detail appropriate for each problem. 'Once you know what your layers consist of and how they intermingle and what kind of defects you have, you can drop down into quantum mechanics to see how those layers adhere to each other and how the materials absorb and reflect light. Then you can step up in the length scales and say if I jab something sharp into the surface which layers would separate first,' explains Harrison.
Philips Applied Technologies is leading a similar collaborative effort to develop a multi-scale modelling approach to understand the behaviour of metal oxide polymer interface materials in system-in-package (SiP) products. SiPs contain many components interfaced with very thin layers of metal oxide polymers, whose job is to stick the whole lot together while also handling high thermal stresses and environmental hazards like the ingress of water. Increasing component miniaturisation is in turn increasing the influence of the atomistic structure of the materials. As a result, product behaviour is becoming strongly dependent on material behaviour at this atomistic scale.
The goal of the NanoInterface consortium is to try to predict with simulation where failures might occur and then make design or material changes in advance to improve reliability, according to Alexandru Opran, manager of the project at Philips Applied Technologies.
The NanoInterface partners are particularly keen to understand the processes that lead to de-lamination between the moulding compound and the lead-frame. Automotive customers, for instance, have become fussy in this respect.
'This de-lamination is caused at the molecular level but at this stage all we know is that if we clean the lead frame, we have fewer problems, but we don't understand why,' explains Willem Van Driel, principal engineer at NXP Semiconductors, one of the partners in the NanoInterface project.
Molecular dynamics simulation is just the starting point, modelling the material as atomic billiard balls with interactions described by potential functions. For some materials, such as simple metals, those potentials are already well known, but for polymers they are yet to be determined. Models at increasing scales provide more insight into gross behaviour. Medium-scale models look at the influence of roughness and filler particles. Finally, 3D macro-scale models incorporate information from the smaller scales in order to predict de-lamination behaviour.
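The "atomic billiard balls" picture can be sketched in a few lines: a pair potential supplies the forces and an integrator advances the positions. The sketch below uses a Lennard-Jones potential and velocity-Verlet integration, in reduced units with illustrative parameters; it is not a production MD code:

```python
# One particle in a Lennard-Jones well, integrated with velocity Verlet.
# Reduced units: epsilon = sigma = mass = 1.

def lj_force(r, epsilon=1.0, sigma=1.0):
    """Force at separation r (positive = repulsive)."""
    sr6 = (sigma / r) ** 6
    return 24 * epsilon * (2 * sr6 * sr6 - sr6) / r

def velocity_verlet(r, v, dt=0.001, steps=5000, mass=1.0):
    """Advance position and velocity; return the final separation."""
    f = lj_force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * (f / mass) * dt * dt
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) / mass * dt
        f = f_new
    return r

# Start slightly stretched from the potential minimum at r = 2**(1/6);
# the particle then oscillates around that minimum, staying bound.
final_r = velocity_verlet(r=1.3, v=0.0)
```

The hard part for polymers, as the project notes, is not the integrator but determining potential functions that are faithful to the real chemistry.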
A major challenge in the project is the huge difference in time scales. 'If you look to molecular dynamic simulations, they are limited to picoseconds, whereas if you look to polymer materials, they need to have typical time scales in seconds, minutes or even hours in order to capture their behaviour accurately,' explains Opran.
A typical simulation goal would be to start with a reference situation (say, the beginning of a failure at the molecular level) and watch to see how it develops over time or with an increase in temperature. 'For instance, what happens if you get water in a package, if the temperature rises from 99°C to 100°C, say, in a few seconds,' says Opran.
A complex SiP design can involve up to five package re-spins, each involving three or so months of testing. Fewer or none would be preferable. 'A predictive tool will help us tremendously. We can use it to select our materials better and make more complex designs,' says van Driel.
Quantamode, led by Marc Bescond at Institut de Materiaux et Microelectronique Nanoscience de Provence (IM2NP) with STMicroelectronics as the industrial partner, is another European research effort on multi-scale modelling of nano effects. This one aims to develop a new generation of quantum transport simulation tools tackling the atomic-scale issues that are becoming important in nano-scale semiconductor devices. One part of this work, says Bescond, is about modelling and understanding the interactions of electrons with impurities in the silicon and with surface roughness, and their impact on electron transport. Another is designing new types of nano-scale transistor architectures, based on findings from these simulations, to improve device performance by taking advantage of the quantum behaviour of the transport. One recent result is a new design for a nanowire MOSFET that substantially reduces tunnelling transmission in the off-state without degrading the on-state. One study has shown that the indentation improves the on-off current ratio by 32 per cent.
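The off-state leakage Bescond targets is a tunnelling effect, and its sensitivity to barrier shape can be illustrated with the textbook WKB estimate for a rectangular barrier, T ≈ exp(−2κL). The barrier height and length below are illustrative numbers, not taken from the Quantamode simulations:

```python
import math

# WKB tunnelling estimate through a rectangular barrier.
HBAR = 1.0546e-34   # J*s
M_E = 9.109e-31     # electron mass, kg
EV = 1.602e-19      # joules per electronvolt

def wkb_transmission(barrier_ev, length_m, mass=M_E):
    """Approximate probability of tunnelling through the barrier."""
    kappa = math.sqrt(2 * mass * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * length_m)

# A 0.3 eV barrier over 5 nm, illustrative of a short channel:
transmission = wkb_transmission(0.3, 5e-9)
print(f"T ~ {transmission:.1e}")
```

Because the exponent scales with barrier width and with the square root of its height, modest geometric changes to a device can shift off-state leakage by orders of magnitude, which is why atomistic design of the barrier region pays off.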
Intermediate-band solar cells
A common factor in these three projects is their focus on modelling widely used materials at the nano scale. Although this might seem unadventurous, it is arguably the critical step before studying whether more exotic systems will behave in a reliable, repeatable and reproducible manner in the manufacturing conditions and processes needed for real products.
One candidate is the intermediate-band solar cell. These cells are based on materials designed to provide two chances to turn light into electrical energy by first exciting electrons from their normal state to an intermediate band and then from the intermediate band to the point they break free. Ideally you would want to make such a material by some kind of self-assembly mechanism. Harrison and his colleagues are looking at an arrangement of multiple layers of quantum dots made out of one kind of semiconductor and capped with another as a likely solution.
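To make the two-step idea concrete, here is a sketch with an invented level scheme; the energies are illustrative, not those of any real quantum-dot material. Each transition energy sets the longest wavelength of light it can harvest, via λ = hc/E:

```python
# Photon thresholds for a hypothetical intermediate-band level scheme.
HC_EV_NM = 1239.84  # hc in eV*nm

def wavelength_nm(energy_ev):
    """Wavelength of a photon with the given energy."""
    return HC_EV_NM / energy_ev

# Illustrative scheme: valence band -> intermediate band -> conduction band
e_vb_to_ib = 0.8                       # eV, illustrative
e_ib_to_cb = 1.2                       # eV, illustrative
e_full_gap = e_vb_to_ib + e_ib_to_cb   # 2.0 eV direct transition

for name, e in [("VB->IB", e_vb_to_ib), ("IB->CB", e_ib_to_cb),
                ("VB->CB", e_full_gap)]:
    print(f"{name}: photons above {e:.1f} eV "
          f"(wavelengths below {wavelength_nm(e):.0f} nm)")
```

The attraction is clear from the numbers: the two sub-gap transitions can harvest infrared photons that a conventional single-gap cell with the same full gap would simply waste.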
'This work combines qualitative understanding of energy levels at a very broad scale with nitty-gritty quantum mechanical calculations of what the wave functions would be like,' he says. 'If we could make it, we think we could get 50 to 60 per cent efficiency out of a solar cell, which would break all records.'
Meanwhile, Steve Bull's group in Newcastle is interested in applying multi-scale analysis to understanding the visco-elastic and time-dependent mechanical properties of biomaterials in order to predict how they might behave as interfaces to electronic devices such as sensors.
'If you want to know whether a sensor is going to survive a certain amount of activity, you need to know the mechanics of everything the sensor is sitting in. The materials are totally different but we can use the same analysis technique as with the glass layers to understand the behaviour of any multi-layered coated assembly, whether the layers are organic or metals or dielectrics,' he says.
Commercial design tools that can analyse these multi-scale effects for all kinds of materials seem likely to emerge within the next three to four years, judging by the activities of Accelrys. Chief scientific officer Frank Brown says that the interest in computational materials analysis is similar to what happened in pharmaceuticals 15 years ago: 'The trend is towards doing many combinatorial experiments, a kind of recipe tweaking, going through lots of samples and trials. Rationalising all that information is difficult, though, so we see informatics becoming increasingly important in the same way as it is in studies for the human genome.'