China’s Baidu made two big moves that are going to make it a major player in the artificial intelligence (AI) space: an extremely powerful new chip designed to compete with Google’s Tensor Processing Unit (TPU) and a wide-spanning alliance with Intel.
First, the company introduced Kunlun, a cloud-to-edge range of AI chips built to meet the high-performance requirements of a wide variety of AI scenarios. The announcement was made at Baidu Create, a developer show that is starting to look an awful lot like Google I/O in terms of content and sessions.
Kunlun leverages Baidu’s AI ecosystem, which spans workloads such as search ranking as well as the company’s open-source deep learning framework, PaddlePaddle. The chips can be used in everything from autonomous vehicles to data centers.
So far, the line consists of two processors: the 818-300 training chip and the 818-100 inference chip. Baidu says Kunlun is 30 times faster than the FPGA-based AI accelerators it introduced in 2011. After seven years, you would expect a leap like that. But Baidu also claimed 260 TFlops of performance at a 100-watt power draw, well above the 45 TFlops of Google’s TPU.
In addition to supporting common open-source deep learning algorithms, Kunlun can handle a wide variety of AI applications, such as voice recognition, search ranking, natural language processing, autonomous driving and large-scale recommendations.
The most important detail was not disclosed: whether Kunlun will be sold to outside customers or offered only through Baidu’s own cloud services, as Google does with its TPU.
Baidu partners with Intel on projects
The company also announced a partnership with Intel on a series of AI projects, including FPGA-backed workload acceleration and a deep learning framework optimized for Xeon Scalable processors. Intel did not identify which FPGA series Baidu would use, but the chipmaker recently announced the integration of its Arria family with its mainstream Xeon server chips.
Baidu said it would optimize PaddlePaddle running on Xeon Scalable processors, including tweaks for computing, memory, and networking. The partners said they would also explore integrating the deep learning framework with Intel’s nGraph deep neural network compiler.
This portion of the deal is a cloud-based partnership, as Baidu said it was looking to develop a “heterogeneous” cloud computing platform based on Intel FPGAs. It already has a similar alliance with Nvidia to customize its PaddlePaddle framework for Volta GPUs and bring AI capabilities to the Chinese consumer market.
Another facet of the alliance involves Mobileye, the Israeli computer-vision company that is now an Intel subsidiary, and Intel’s Movidius vision processing chips, in support of Baidu’s Apollo autonomous vehicle project. The deal pairs Mobileye’s Responsibility Sensitive Safety (RSS) model with the Movidius Myriad 2 VPU in the Apollo Pilot.
This is not Intel/Mobileye’s first dance with autonomous vehicles. It already has partnerships with BMW and Fiat Chrysler.
The winner in this is Baidu and, by extension, China. Baidu has modeled itself on Google to great effect and has an enormous presence in its home market, thanks in part to Google’s exit from China in 2010. And China isn’t known for being generous with its tech. What are the chances we’ll see Apollo vehicles in the U.S., or for that matter the Kunlun?
Baidu is rapidly taking shape as the chief competitor to Google; it is the only company with comparable scale and reach. The only question is whether it will make an international push. It tried expanding into Japan a few years back, and that effort failed, although I suspect the reasons had nothing to do with technology. By and large, Baidu has limited its U.S. presence to recruiting talent for back home.
At the very least, it’s positioning itself as a dominant player in the biggest market in the world. But if it goes global, it’s certainly in position to give Google a run for its money.