Ian King (Bloomberg) -- Groq Inc., a semiconductor startup, said it raised $300 million in a round led by Tiger Global Management and D1 Capital.
The investment brings Groq’s total funding to $367 million and values the Mountain View, California-based company at more than $1 billion. The latest financing included other investors such as The Spruce House Partnership and Addition LP, the venture capital firm founded by former Tiger Global partner Lee Fixel.
Groq will use the proceeds to hire more employees and speed up the development of new products. Chief Executive Officer Jonathan Ross, a former Google chip executive, said the startup now has enough money to fund the business until it generates positive free cash flow.
“We’re growing, so we’re hiring a lot of talent,” he said. “Talent has been flowing away from semiconductors in the last 20 years. That’s flipping.” The company has 122 employees and wants to double that number. Ross is wooing software engineers from some of the largest tech companies because they want to work on hardware, too.
Groq is trying to break into the rapidly growing market for new semiconductors that run artificial intelligence software, competing against incumbents such as Intel Corp., Advanced Micro Devices Inc. and Nvidia Corp., and startups including Ampere Computing and Cerebras Systems.
Ross said his approach is unconventional and more efficient than the AI chips made by Nvidia and rivals. An electric vehicle maker and a financial-services company are in the initial stages of deploying Groq chips, he added, while declining to identify the customers.
Groq and its larger competitors are trying to appeal to major cloud companies such as Amazon.com Inc. and Google that are seeking better ways to analyze the flood of information their data centers suck in from smartphone users, online shoppers and internet searches. These companies are among the largest purchasers of chips but are also designing their own chips for AI, intensifying competition further.
Ross started Google’s most-successful effort in this area: the Tensor Processing Unit, or TPU, which powers a rising volume of the internet giant’s AI workloads.
Rather than seeking to improve on existing designs, Groq started with the software then created a programmable chip that’s relatively simple but fast. It has a giant amount of memory to store information, and rows of circuits that manipulate the required numbers and not a lot else.
An accompanying compiler -- software that turns computer programs into instructions that the chip can execute -- is the secret sauce. It allows multiple machine-learning models to be loaded and run in a way that yields instant results, Ross said.
Groq’s chips are primarily designed for the inference market -- chips that run AI models already trained on other hardware to process information and make decisions. Inference chips are critical for things like self-driving vehicles and automated speech recognition.