WikiChip Fuse
Groq Tensor Streaming Processor Delivers 1 PetaOPS of Compute

AI startup Groq makes an initial disclosure of its Tensor Streaming Processor (TSP), a single chip capable of 1 petaOPS or 250 teraFLOPS of compute.

Intel Announces Keem Bay: 3rd Generation Movidius VPU

Intel announces Keem Bay, its 3rd-generation Movidius VPU edge inference processor.

Intel Spring Hill: Morphing Ice Lake SoC Into A Power-Efficient Data Center Inference Accelerator

First detailed at Hot Chips 31, Intel Spring Hill morphs the Ice Lake SoC into a highly power-efficient data center inference accelerator.

Analog AI Startup Mythic To Compute And Scale In Flash

A look at the IPU architecture of analog AI startup Mythic, which aims to significantly reduce power consumption by computing directly in the analog domain inside flash memory.

Alibaba Launches DC Inference Accelerators

Alibaba launches a homegrown inference accelerator for its own cloud.

Inside Tesla’s Neural Processor In The FSD Chip

A deep dive into the custom-designed Tesla neural processing units integrated inside the company’s full self-driving (FSD) chip based on the Tesla Hot Chips 31 talk.

Nvidia Inference Research Chip Scales to Dozens of Chiplets

Nvidia recently presented a research chip comprising dozens of chiplets, enabling it to scale from milliwatts to hundreds of watts and cater to markets ranging from edge, mobile, and automotive to the data center.

Intel’s Spring Crest NNP-L Initial Details

An initial look into Intel’s upcoming Nervana Neural Network Processor (NNP) accelerators.

Hot Chips 30: Nvidia Xavier SoC

An overview of the Xavier SoC which was detailed by Nvidia at Hot Chips 30.

Cambricon Reaches for the Cloud With a Custom AI Accelerator, Talks 7nm IPs

Cambricon has announced its first high-performance, high-power AI accelerator for the data center in an effort to gain share in the growing Chinese AI market.
