Reddit to be language teacher to OpenAI's latest venture

OpenAI – the non-profit artificial intelligence (AI) research company backed by billionaire Elon Musk – recently became the first customer for Nvidia’s ‘AI supercomputer in a box’, known as the DGX-1.

Nvidia’s CEO Jen-Hsun Huang presented the DGX-1 to OpenAI and, according to MIT Technology Review, the researchers already have an idea for a project – to teach the AI to learn and chat through Reddit forums.

MIT says the DGX-1 is optimised for the type of machine learning known as deep learning, which involves feeding data to a hefty network of crudely simulated neurons. The technique has driven great leaps in AI over the past few years, and OpenAI’s DGX-1 will allow researchers to train deep-learning systems more quickly on more data. Andrej Karpathy, a researcher at the company, said: “Deep learning is a very special class of models because as you scale [them] up, they always work better.”
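To make that scaling point concrete, here is a minimal, purely illustrative sketch (in Python with PyTorch, not OpenAI’s actual code) of a network of simulated neurons whose capacity is set by two knobs, width and depth. “Scaling the model up” amounts to turning those knobs and feeding the larger network more data on faster hardware.

    # Illustrative only: a network of simulated neurons whose size is a parameter.
    import torch
    import torch.nn as nn

    def make_mlp(in_dim: int, out_dim: int, width: int, depth: int) -> nn.Sequential:
        """Build a fully connected network; larger width/depth means a bigger model."""
        layers = [nn.Linear(in_dim, width), nn.ReLU()]
        for _ in range(depth - 1):
            layers += [nn.Linear(width, width), nn.ReLU()]
        layers.append(nn.Linear(width, out_dim))
        return nn.Sequential(*layers)

    small = make_mlp(in_dim=784, out_dim=10, width=256, depth=2)    # laptop-sized
    large = make_mlp(in_dim=784, out_dim=10, width=4096, depth=8)   # supercomputer-sized
    print(sum(p.numel() for p in small.parameters()),
          sum(p.numel() for p in large.parameters()))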

According to Tom’s Hardware, a single high-performance system like this is more useful to researchers than a larger pool of weaker, widely distributed GPU cores, because deep-learning software doesn’t scale well across loosely connected machines. The DGX-1’s tightly coupled parallel architecture, by contrast, is well suited to OpenAI’s deep-learning algorithms.
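As a rough illustration of why a tightly coupled multi-GPU box matters, the hypothetical PyTorch sketch below splits each training batch across whatever local GPUs are present (on a DGX-1, its eight P100s), so work is exchanged over a fast local interconnect rather than a slow network between separate machines.

    # Illustrative only: data parallelism across the GPUs inside one box.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 10))
    if torch.cuda.device_count() > 1:
        # Replicate the model onto every local GPU and split each input
        # batch between them automatically.
        model = nn.DataParallel(model)
    model = model.to("cuda" if torch.cuda.is_available() else "cpu")

    x = torch.randn(512, 1024, device=next(model.parameters()).device)
    y = model(x)   # the batch of 512 examples is sharded across the available GPUs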

The DGX-1 will be used by OpenAI – created to develop open-source-friendly AI carefully, so that it benefits society rather than destroys us in future – to read the gargantuan number of Reddit threads and comments in months, a significant improvement on the years the task could take on other deep-learning machines. Karpathy said: “You can take a large amount of data that would help people talk to each other on the internet, and you can train, basically, a chatbot, but you can do it in a way that the computer learns how language works and how people interact.”
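The kind of data Karpathy describes is easy to picture: pair each Reddit comment with the reply it received and you have prompt/response examples a chatbot can learn from. The sketch below uses a hypothetical, simplified thread format purely for illustration.

    # Illustrative only: turning a comment thread into (prompt, response) pairs.
    from typing import Dict, List, Tuple

    def build_training_pairs(comments: List[Dict]) -> List[Tuple[str, str]]:
        """Pair every comment with the parent comment it replies to."""
        by_id = {c["id"]: c for c in comments}
        pairs = []
        for c in comments:
            parent = by_id.get(c.get("parent_id"))
            if parent is not None:
                pairs.append((parent["body"], c["body"]))  # (prompt, response)
        return pairs

    thread = [
        {"id": "a1", "parent_id": None, "body": "What GPU is best for deep learning?"},
        {"id": "b2", "parent_id": "a1", "body": "Depends on your budget, but the P100 is hard to beat."},
    ]
    print(build_training_pairs(thread))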

OpenAI scientist Ilya Sutskever said that the company won’t have to do much new work on language learning and image recognition, as “we won't need to write any new code, we'll take our existing code and we'll just increase the size of the model.

“And we'll get much better results than we have right now.”

Nvidia has priced the DGX-1 at $129,000. The system contains eight Nvidia Tesla P100 GPUs, 7TB of SSD storage and two Xeon processors, and can process data at a peak of 170 teraflops, roughly the equivalent of 250 conventional servers. It cost around $2bn to develop.

The technology company is known for designing graphics processing units (GPUs) for the gaming market, and system-on-a-chip units for the mobile computing and automotive markets. It unveiled its new range of mobile GPUs based on the Pascal architecture today. According to Eurogamer, the results are quite extraordinary and the units represent a huge leap in performance over their predecessors.
