More than a dozen chip engineers recently joined Google’s team at a new site in Bengaluru, and the company wants to add more, according to Reuters, whose reporter combed through Google employees’ LinkedIn profiles and the company’s job postings, and got two industry executives to confirm the expansion.
Bengaluru has emerged as one of the world’s top semiconductor design hubs over the last two decades, and Google has poached at least 16 engineering veterans from Intel, Qualcomm, Broadcom, and Nvidia to work for its “gChips” team there, the report said.
It’s unclear whether the engineers will design chips for consumer devices, Google data centers, or both. The handful of hyperscale cloud platform operators, Google among them, have all been beefing up their in-house chip design capabilities in recent years, devoting much of that brainpower to server processors. Google and Amazon already have their own chips powering portions of their cloud infrastructure.
Having these capabilities in-house achieves two things. First, as Moore’s Law slows down, companies have turned to optimizing processors for specific applications as the primary way to accelerate performance. Even though Intel, the dominant supplier of server chips, consults its biggest customers as it designs its next parts and allows some customization for hyperscalers, its designs still have to deliver value for a broad customer base. An in-house chip design team at Google only has to worry about optimizing for Google’s software.
Second, it reduces reliance on Intel and on dominant semiconductor suppliers in other markets, such as Qualcomm in smartphone chips and Nvidia in GPUs for machine learning. Most of the world’s data center muscle, including the hyperscale platforms’, is delivered by Intel’s x86 chips. In this near-monopoly, even the largest customers don’t have as much say as they’d probably like.
One of Google’s new hires in Bengaluru, Rajat Bhargava, said on LinkedIn that he is Google’s “silicon site lead” there, and that he previously spent 10 years at Broadcom and one at Intel.
An anonymous “industry executive familiar with Google’s plans” told Reuters that the Google team at the site could grow to 80 people by the end of the year.
Google unveiled its first in-house processor in 2016: the Tensor Processing Unit, or TPU, an ASIC that powers deep learning applications in the company’s data centers. Google also rents TPUs out as a cloud service.
Amazon Web Services unveiled a custom chip for machine learning inference in November 2018. The AWS Inferentia chip was designed by engineers from Annapurna, an Israeli chip startup Amazon acquired in 2015. The same week, AWS also unveiled an Annapurna-designed custom chip for less compute-intensive cloud workloads, based on the Arm architecture.