Exclusive

Driverless car advances accelerated by Selenium's robotic brain

Oxbotica's Geni driverless car. Image credit: Oxbotica

Taking on the behemoths of Silicon Valley is Selenium, a robotic brain that learns on the move, shares its knowledge with other systems, guides driverless cars and can operate anywhere - even on Mars. Louise Murray took the new technology for a spin.

Inside the high-security Culham Science Park in south Oxfordshire, which also houses part of the UK Atomic Energy Authority, is the headquarters of Oxford University Robotics Institute spin-off Oxbotica. I'd driven in appalling motorway conditions the previous evening, contending with rain, snow and multiple accidents, to have my first ride in an autonomous car, wishing for most of the way that my aged Subaru Forester was already fully autonomous.

The form of the Geni is unremarkable, with its cool factor on par with a fairground dodgem or the Smart car, which it resembles. No petrolhead will be eager to park a Geni outside their door. The vehicle is derived from a highly modified electric Renault Twizy, itself classed as a quadricycle rather than a car because of its size, weight and top speed. Yet to focus on the vehicle is to completely miss the point.

This four-wheeled, doors-optional vehicle has been given a makeover at Oxbotica. It is purely a platform for testing some very clever software. Tucked behind the two seats is one of the most innovative robotic brains developed for the driverless car sector. It devours data from many sensors around the car, processes it on the fly, makes decisions about object avoidance and navigation, and shares its experience with other autonomous vehicles using the software.

An earlier incarnation of the visual navigation software was licensed from the Robotics Institute at Oxford University by the European Space Agency and will travel to Mars in 2020 to help the ExoMars rover navigate over the planet's surface.

Launched in 2016, the Selenium operating system represents the equivalent of 200 years' worth of labour in mobile autonomy. Oxbotica co-founder Professor Paul Newman, an internationally renowned expert in robotic navigation, describes his life's mission as getting machines to know where they are, understand what is around them and decide what to do next. In short: to do useful stuff.

“It is one of the hardest challenges I have faced,” says Newman. There is a huge difference between autonomous vehicle design and manufacture, and development of the more important intellectual property – control software. “There are many bodies, but few brains out there,” he says.

Commercial confidentiality prevents the company from discussing which manufacturers use Selenium, but the modular nature of the software design lets clients pick and choose the elements they want, as it is able to plug into external proprietary systems. Clients worldwide are licensing vision and laser navigation components to operate in conjunction with their own software, and the exuberant Newman will only say that 2017 is shaping up to be an extraordinary year.

Dub4, one of the newest components of Selenium, was launched as visual localisation software at CES in January 2017. It is the first navigation operating system to rely entirely on visual cues, with no dependence on GPS, which can fail underground, indoors or under tree cover, for example. Initially, on a new platform, Dub4 must be 'shown the road or route'. This can be done by a human driver, or another autonomous vehicle running Dub4 can share its data, such as what Oxford Street looks like. Selenium is not reliant on any prior third-party survey, but Newman says it can use third-party 3D mapping input.
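
To make the 'teach first, then localise' idea concrete, here is a minimal sketch of how a teach-and-repeat visual localisation loop could be organised. It is illustrative only: the class and function names are hypothetical and are not Oxbotica's Dub4 API.

    # Hypothetical sketch of teach-and-repeat visual localisation (not Dub4's API).
    # Teach pass: a human drives the route and visual landmarks are recorded.
    # Repeat pass: live camera features are matched against the stored experience,
    # so the vehicle can work out where it is without any GPS input.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Landmark:
        descriptor: tuple   # compact visual signature of an image feature
        position_m: tuple   # (x, y, z) position relative to the route start

    @dataclass
    class RouteExperience:
        name: str
        landmarks: list = field(default_factory=list)

        def teach(self, observed_landmarks):
            """Record landmarks seen while a human drives the route."""
            self.landmarks.extend(observed_landmarks)

        def share_with(self, other_store):
            """Experience can be handed to another vehicle running the same software."""
            other_store[self.name] = list(self.landmarks)

    def localise(live_features, experience):
        """Return stored landmarks whose descriptors match the current camera frame."""
        live = {f.descriptor for f in live_features}
        return [lm for lm in experience.landmarks if lm.descriptor in live]

A second vehicle that receives the shared landmarks can then run the same localise step on, say, Oxford Street without ever having driven it.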

Oxbotica is an independent company, profitable since inception and unallied to any specific brand or manufacturer because the software is platform-agnostic. It can be deployed in a robotic picker in a warehouse, in a luxury passenger car, in driverless public transport or in mining trucks above or below ground, where its independence from a GPS feed makes it a perfect fit. In the future, Selenium could even operate robotic vehicles in hostile environments like war zones, for nuclear testing or in cometary mining.

The car has an array of sensors that enable Selenium to make sense of the world around it. At the rear of the vehicle are a 3D scanner and a push-broom 3D laser scanner – a sensor that generates data as it is 'pushed' through the world by the vehicle. Two more 3D scanners and three stereo cameras at the front complete the sensor suite. The Selenium operating system uses input from all of these sensors to model the structure of the scene as a 3D point-cloud map, localising the Geni within that map in real time.
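
As a rough illustration of that processing cycle, the sketch below accumulates sensor returns into a point-cloud map and then estimates a pose within it. It is a simplification under obvious assumptions: the real system performs proper scan matching, whereas the localise step here is only a stand-in.

    # Simplified sketch (not Oxbotica's code) of one processing cycle:
    # fuse laser and stereo-camera returns into a 3D point cloud, then
    # estimate the vehicle's pose within that cloud in real time.

    import numpy as np

    class PointCloudMap:
        def __init__(self):
            self.points = np.empty((0, 3))          # accumulated (x, y, z) points

        def integrate(self, scan_points):
            """Add a batch of 3D points from the laser scanners and stereo cameras."""
            self.points = np.vstack([self.points, scan_points])

        def localise(self, observed_points, prior_pose):
            """Stand-in for a scan-matching step (e.g. ICP-style registration):
            here the prior pose is simply shifted by the difference of centroids."""
            offset = observed_points.mean(axis=0) - self.points.mean(axis=0)
            return prior_pose + offset

    # One cycle: integrate the latest scans, then localise against the map.
    world = PointCloudMap()
    world.integrate(np.random.rand(500, 3))         # stand-in for real scan data
    pose = world.localise(np.random.rand(50, 3), prior_pose=np.zeros(3))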

The sensors deployed will vary with each situation. In a relatively simple environment, such as a warehouse, a robot could depend purely on an inexpensive visual camera system for navigational input, while the full suite would be used on a passenger vehicle in a complex environment among other road users.
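
A hedged sketch of what that modularity could look like in configuration terms; the suite names and entries below are invented for illustration, not Oxbotica's.

    # Illustrative only: the sensor suite is a per-deployment configuration choice,
    # not a fixed part of the navigation software.

    SENSOR_SUITES = {
        "warehouse_robot":   ["mono_camera"],                 # simple, low-cost setup
        "passenger_vehicle": ["stereo_camera_x3",
                              "front_3d_scanner_x2",
                              "rear_3d_scanner",
                              "pushbroom_3d_laser"],          # full suite for busy roads
    }

    def sensors_for(platform):
        """Return the sensor inputs the navigation stack should subscribe to."""
        return SENSOR_SUITES[platform]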

Machine learning allows Selenium to learn from its mistakes and share that experience with other vehicles running the same system, improving performance over time. Road safety experts believe this will make autonomous vehicles much safer, as some 90 per cent of road accidents result wholly or in part from human error.
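
One way to picture that fleet-wide learning loop, purely as an assumption-laden sketch rather than a description of Oxbotica's implementation:

    # Hypothetical illustration of fleet-wide experience sharing: a correction
    # recorded by one vehicle is pooled centrally and pulled by the others.

    shared_pool = []        # stands in for a cloud-hosted experience store

    def report_correction(vehicle_id, situation, better_action):
        """One vehicle logs a case where its behaviour had to be corrected."""
        shared_pool.append({"vehicle": vehicle_id,
                            "situation": situation,
                            "action": better_action})

    def sync(local_experience):
        """Another vehicle merges the pooled corrections into its own experience."""
        merged = list(local_experience)
        merged.extend(e for e in shared_pool if e not in merged)
        return merged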

The driver-vehicle interface is a large tablet mounted on the dashboard, which communicates wirelessly with the Selenium brain housed in an off-the-shelf PC. There is a sequential handover between human control and autonomy, which makes it impossible for driverless operation to begin without the full engagement and knowledge of the driver. Only once the touchscreen is pressed is control handed over to Selenium.
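
That sequential handover can be pictured as a small state machine; the sketch below is a hypothetical reading of the description above, not the actual interface logic.

    # Hypothetical handover state machine: autonomy can only engage after the
    # system declares itself ready AND the driver explicitly confirms on screen.

    from enum import Enum, auto

    class Mode(Enum):
        MANUAL = auto()
        READY = auto()       # checks passed, awaiting the driver's confirmation
        AUTONOMOUS = auto()

    class Handover:
        def __init__(self):
            self.mode = Mode.MANUAL

        def system_ready(self, sensors_ok, route_known):
            """Step 1: autonomy may only be offered once the system's checks pass."""
            if self.mode is Mode.MANUAL and sensors_ok and route_known:
                self.mode = Mode.READY

        def driver_confirms(self):
            """Step 2: the touchscreen press, only valid from the READY state."""
            if self.mode is Mode.READY:
                self.mode = Mode.AUTONOMOUS

        def driver_takes_back(self):
            """The driver can always drop back to manual control."""
            self.mode = Mode.MANUAL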

I was completely unprepared for my reaction. The moment when the steering wheel turned by itself was as shocking and unnatural as that scene in ‘The Exorcist’ when the child’s head rotates 180 degrees. It’s something you feel in your stomach. After my initial response, the sensation of being driven by an AI was weird, marvellous and exciting, all at once. The car actually drives differently from a human, accelerating and braking at points that we wouldn’t. It’s not that it is, or even feels, unsafe, but at some level as a passenger, you are aware of the difference.

Transition to full autonomy, with no input required from humans beyond selection of the desired destination, is some way off and may be anything from five to 20 years away according to experts in the field.

Selenium will be out and about in Greenwich, south London, from spring 2017. It will be the brain beneath the bonnet of shuttle vehicles developed by Westfield Cars and Heathrow Airport, where such pods operate to and from the car parking facilities.

In a six-month public trial in the Greenwich peninsula run by the GATEway consortium, led by transport research organisation TRL, the shuttles will run on routes shared with pedestrians, cyclists and other drivers near the O2 arena.

TRL director Professor Nick Reed says: “We aim to give members of the public the experience of riding in automated vehicles and exploring their views while gaining experience of the systems and technology needed to operate automated transport in a complex urban environment.”

People living on a local residential estate will be able to call up a pod to transport them to nearby transport links using cloud-based Caesium fleet-management software, also developed by Oxbotica. Caesium enables a smartphone booking system, route optimisation according to passenger demand, and data exchange between vehicles without operator intervention. The system also submits data about the mode of each shuttle (manual or autonomous), remaining battery power, key component temperature, speed, heading and destination.
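
To illustrate the kind of status record each shuttle might report to the fleet-management layer, here is a minimal sketch built only from the fields listed above; the names and the recall rule are invented for illustration, not Caesium's actual interface.

    # Hypothetical shuttle status record, using only the fields named in the article.

    from dataclasses import dataclass

    @dataclass
    class ShuttleStatus:
        vehicle_id: str
        mode: str                  # "manual" or "autonomous"
        battery_pct: float         # remaining battery power
        component_temp_c: float    # temperature of key components
        speed_mps: float
        heading_deg: float
        destination: str

    def needs_recall(status, min_battery_pct=20.0):
        """Example fleet-management decision (invented): recall a pod to charge
        when its remaining battery drops below a threshold."""
        return status.battery_pct < min_battery_pct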

Reed is confident that automated vehicles will revolutionise transportation of people and goods and predicts benefits to congestion levels, safety and pollution.

If you can’t get down to Greenwich to participate in the trials, Selenium will also be on show in Milton Keynes in the Lutz Pathfinder self-driving pod project run by UK Autodrive, operating along pedestrianised routes near the railway station this spring.
