LAS VEGAS, April 23 — Google unveiled a new generation of tensor processing units (TPUs) for training artificial intelligence models and powering the digital “agents” that are all the rage in the tech world.

Google, along with Amazon, has taken to making its own cutting-edge AI chips, taking control of its designs and seeking to reduce reliance on the coveted Nvidia graphics processing units (GPUs) that dominate the market.

High-performance TPUs were among innovations touted at Google’s annual cloud computing conference in Las Vegas on Wednesday.

“In the era of AI agents, infrastructure needs to evolve to take on the most demanding AI workloads,” Google chief executive Sundar Pichai said in a blog post.

“This year, we’re bringing the eighth generation of our Tensor Processing Units with a dual chip approach.”

One of the new TPUs is optimized for training the large language models that power AI; the other is tailored for “inference,” the reasoning and decision-making process by which AI agents respond to requests.

AI agents are digital assistants that can independently tend to computing tasks.

The TPUs, created in partnership with semiconductor maker Broadcom, will be available later this year, according to Google Cloud unit leader Thomas Kurian.

Nvidia early this year announced production of its new Vera Rubin platform, which pairs Vera CPUs with Rubin GPUs to power AI, shortly before cloud computing giant Amazon unveiled the latest generation of its custom Trainium processors.

Google, Amazon, and Microsoft continue to integrate Nvidia GPUs into their computing infrastructures. — AFP