


This New AI Chip Could Help Google, Facebook 'See' Videos


Image via Graphcore

I’ve just clicked on a tutorial video on YouTube about puppy-training, but there’s nary an ad about dogs or even pet care.

Instead, YouTube cues up a video ad for dishwashing tablets, before popping up a banner ad for a mobile game I’ll probably never play.

Google has struggled to make its video ads on YouTube relevant to what people are watching. Facebook has the same issue with the thousands of video clips uploaded from people's phones each day. Part of the problem is that their algorithms can't see the images flashing across each video, which would help them target human eyeballs more effectively.

That would require extremely heavy computation, in real time and at massive scale.

Such is one of the more mundane pursuits for artificial intelligence: targeting you more effectively with video ads. And to do this, Google and Facebook need more powerful chips to process all the visual data uploaded each day to their platforms.

Today they, along with Microsoft, are evaluating the latest AI-focused chip produced by U.K. startup Graphcore, two sources close to the company say.

On Monday Graphcore announced it had raised $50 million from Sequoia Capital to produce cutting-edge processors to power the next generation of AI software: digital assistants that understand you better, and the ad-targeting software behind video platforms like YouTube.

Graphcore founder Nigel Toon said the new money would go towards expanding his commercial team to support his early customers. “We have 75 people in the company and we’ll double that in the next two years,” he said.

The company will ship its AI-focused chip to its first customer in early 2018, he added; Toon has worked on the IPU project for four years. The chips are designed to power the servers processing masses of data in the cloud for some of the world’s biggest Internet companies.

It’ll take longer for AI chips like Graphcore’s to be installed in physical devices like smartphones or smart speakers, Toon said, because so many Internet services are processed in the cloud anyway.

“The cloud is the initial area of really strong interest,” he added.

So far, some of the best chips on the market for machine learning have been the GPUs sold by Nvidia. Graphcore, based in the English city of Bristol, claims that its chip is up to 100 times more powerful than current hardware.

Google also has its own AI-focused chip called the Tensor Processing Unit or TPU, which works in tandem with Google’s TensorFlow framework for building machine learning software.
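To illustrate what "working in tandem" means in practice, here is a minimal, illustrative sketch using TensorFlow's public Keras API (not Google's internal code, and the layer sizes are arbitrary): the model is described once against the framework, and TensorFlow dispatches the underlying matrix math to whatever accelerator is available, whether that is a CPU, a GPU or a TPU.

```python
import tensorflow as tf

# Illustrative only: a small image classifier defined against TensorFlow's
# Keras API. The same definition can be trained on CPUs, GPUs or TPUs;
# the framework handles routing the math to the hardware.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```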

But Toon says Graphcore’s IPU chip is faster and more flexible than Google’s. “The IPU is a completely new architecture,” Toon says.

Toon wouldn’t disclose the names of any of his early customers, but he did say that early applications of his chip would be “to understand the context and recognize images from video uploads.”

“Social media sites need to stay on top of the way people are communicating,” he added. “It’s not just about text but images. What people are trying to think about is how to do video… If they don’t know what you’re doing on the social media site, how do they monetize the social media service?”

Graphcore’s IPU is a new chip specially designed to power machine-learning algorithms. Internet giants like Apple, Amazon and Google are increasingly using machine learning to create intelligent services like Siri and Alexa, while AI-focused chips will also be crucial to self-driving cars, which can't rely solely on connectivity to the cloud for processing power.

The name “IPU” represents an evolution in nomenclature. First there were the standard chips that powered computers, called central processing units (or CPUs), which ran things like office applications. Then came graphics processing units (or GPUs), which rendered images and video.

Nvidia won the battle to dominate the market for GPUs. While its chips were originally designed for graphics processing and found a lucrative niche in the gaming market, they’ve more recently been used for deep learning thanks to their thousands of “cores” or tiny computers that work in parallel.

Much of the same math used to render pixels for transparency or lighting can also be used to run deep-learning algorithms, which is why Nvidia has recently found itself at the center of machine learning developments, and seen its stock rise 10-fold in just the last two years.
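A minimal sketch below makes the point concrete (illustrative only, not Graphcore or Nvidia code, with made-up sizes): a classic graphics operation and a neural-network layer are both, at bottom, large matrix multiplications, exactly the kind of work that thousands of parallel cores can split up.

```python
import numpy as np

# Graphics-style workload: transform a batch of 3D points with a 4x4 matrix,
# the kind of operation GPUs were originally built to parallelize.
vertices = np.random.rand(10_000, 4)           # points in homogeneous coordinates
transform = np.eye(4)                          # placeholder transformation matrix
projected = vertices @ transform.T             # one big parallel matrix multiply

# Deep-learning-style workload: a fully connected layer over a batch of inputs.
# Structurally it is the same operation: a dense matrix multiply plus a nonlinearity.
inputs = np.random.rand(10_000, 512)           # batch of feature vectors
weights = np.random.rand(512, 256)             # layer weights (learned in practice)
activations = np.maximum(inputs @ weights, 0)  # matrix multiply followed by ReLU

print(projected.shape, activations.shape)      # (10000, 4) (10000, 256)
```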

But Nvidia’s GPUs weren’t designed for deep learning, and a new crop of startups like Graphcore, along with Wave Computing in California, claim to have built better chips designed specifically for AI.

“It’s got the flexibility and the higher level of compute,” says Toon of his IPU chip. “We’ve looked at not just what people are trying to do today but also the innovators and researchers to understand what they like to do going forward, and making sure our processor can support that. We talk about allowing innovators to create the next breakthrough.”