From WikiChip
AWS Inferentia - Annapurna Labs (Amazon)
| AWS Inferentia | |
| General Info | |
| Designer | Annapurna Labs |
| Introduction | November 28, 2018 (announced) |
| Microarchitecture | |
AWS Inferentia is a neural processor designed by Annapurna Labs for Amazon's own AWS cloud services. The chip was first announced on November 28, 2018, and is said to provide low-latency inference with throughput in the hundreds of tera-operations per second (TOPS).
Retrieved from "https://en.wikichip.org/w/index.php?title=annapurna_labs/aws_inferentia&oldid=84499"
Facts about "AWS Inferentia - Annapurna Labs (Amazon)"
| designer | Annapurna Labs |
| first announced | November 28, 2018 |
| full page name | annapurna labs/aws inferentia |
| instance of | microprocessor |
| ldate | November 28, 2018 |
| name | AWS Inferentia |