Parallel worlds

Multicore development can't wait for new languages, so developers are stepping in with a self-help guide to keeping deadlocks at bay.

Everybody knows what the problem is. In the semiconductor business, the most popular programming languages are C and C++. However, since multi-core processing architectures moved into the silicon mainstream, it has become apparent that C/C++ does not lend itself well to writing software that fully exploits - or even reflects - the inherent parallelism of today's chips. There has long been a 'software gap', but this trend threatens to widen it faster than before. Computing faces a new set of performance bottlenecks as systems demand software that can be efficiently multi-threaded.

Meanwhile, how do you verify code in a language that you know is far from being optimal for the environment in which it is going to be run? Finally, in the increasingly important embedded software arena, designers want to know how to exploit typically heterogeneous system architectures without having to fundamentally change the programming model. These problems, as well as quite a few others, are already giving engineers headaches - this is not their horizon, it is where they stand. 

"So, everybody is waiting for this great new parallel, multi-core, many-core programming language to come along and take all the pain away, but no one can say exactly how long that's going to take," explains David Stewart, CEO of Edinburgh-based co-processor and electronics-design specialist CriticalBlue. "Well, you can't just say, 'Okay, we'll stop and we'll wait for that.' Work carries on - it has to. Talking about C being the 'wrong language' is one thing, but it won't get any products designed. So, people are out there doing what they can with the tools and the languages that are available," Stewart insists.

Given that a make-do-and-mend environment is self-evidently the norm, Stewart says one further and important conclusion could then be drawn. "This all implies that companies are building up a base of people who are capable of writing parallel software by hand, and a set of best practices and methodologies that make use of what's there today," he says. "What we need to do is make that knowledge as widely available as possible." With this aim in mind, Stewart approached the Multicore Association (MA). At the time, he could not be sure whether the recently established technical alliance would be interested - its main focus to date had been on APIs in areas such as resource management and communication. 

In another respect, it had been looking at the bedrocks for standards, and that was not what Stewart initially had in mind (although standards could follow). Then, even if the association was interested, what about its membership? The kind of best-practice discussion Stewart wanted to stimulate might well have foundered if the 800lb gorillas - and the association does have Intel, Texas Instruments and Freescale Semiconductor in its ranks - decided that this kind of information represented competitive advantage, and therefore was not for sharing. "In the end, though, we found that everybody was very open to the idea," Stewart claims. "I think the important points here were that we'd identified a problem that everybody faced and we were looking to tackle it on a basis that everyone was comfortable with." The MA's resulting Multicore Programming Practices (MPP) working group is therefore going forward with what Stewart himself describes as "a lowly goal".

The official blurb calls for: "The creation of MPP, a best-practices guide to the writing of C/C++ embedded software, such that it may be more easily compiled across a range of multicore processor platforms. MPP will be an open document, possibly a book or booklet, created by a working group operating under the Multicore Association standards body, and constructed in layers such that initial works may be delivered quickly while being further refined. The document could also form the basis of future association standards."

However, Stewart himself couches the project's direction at launch in more immediate but, at the same time, much looser terms. "Initially, we just want to capture the current best practices, the top 10 or so things you can do that will prevent mistakes," he says. "It really isn't that much but if we can get together around these things, it's potentially a huge win for the industry." Indeed, Stewart is personally careful not to bandy around the 'standards' word as even a longer-term objective with the same kind of force that it might have in the mission statement. He has good reason not to. So much was evident from an MA session dedicated to the new working group at this year's Design Automation Conference. Warnings were sounded about taking an approach that could conflict with the various technology and software approval processes demanded by different industries, in particular the military and aerospace markets. The resulting face-off could, some attendees warned, kill the MPP project.

There have also, and obviously, been questions over what kind of design bias the MPP should adopt first, especially if it wants to get information out to designers relatively quickly. For example, Stewart is joined as co-chair of the working group by Max Domeika, a senior staff software engineer at Intel, who makes no bones about his company's preference for the group to concentrate initially on challenges for embedded software programming. Having set out to maintain the best possible balance to its best-practice activities, the MPP was able to attract 15 MA members and four 'guests' to a kick-off meeting in Austin, Texas, in early June. "And it wasn't that difficult to get people to agree about the top five issues," Stewart says. "There is a lot of commonality there, although, yes, there are some significant areas of diversity." Another sign that the MPP is getting traction comes in the form of technology donations. CriticalBlue itself has provided "a framework of methodology considerations and examples of commonly observed programming issues together with their solutions, with performance analysis where appropriate."

Similar packages have also come from Intel, Nokia Siemens Networks and Open Virtual Platforms. The last of those three is a vehicle for open source technologies from Imperas, one of the companies that hope to ultimately be key players in the post-C/C++ world. What happens now is that the working group will continue to convene on a regular basis with the objective of getting its first MPP release out by the time of the Multicore Expo, due to be held in Santa Clara, California, next March. The call is also going out that any other company that may want to contribute or participate is still welcome to get on board. "And what we're looking to have by the Expo, even though it may well not be definitive, is something where you can say, 'This is the guide you should read before you touch the keyboard,'" says Stewart. "Simple as that."
