From WikiChip
Latest revision as of 15:37, 28 November 2018
AWS Inferentia

General Info
Designer: Annapurna Labs
Introduction: November 28, 2018 (announced)
Microarchitecture: —
AWS Inferentia is a neural processor designed by Annapurna Labs for Amazon's own AWS cloud services. The chip was first announced in late November 2018 and is claimed to deliver low latency and high throughput, in the hundreds of TOPS.