AWS Inferentia - Annapurna Labs (Amazon)

AWS Inferentia
General Info
Designer: Annapurna Labs
Introduction: November 28, 2018 (announced)
Microarchitecture: —

AWS Inferentia is a neural processor designed by Annapurna Labs for Amazon's own AWS cloud services. The chip was first announced in late November 2018 and is said to deliver low latency and high throughput on the order of hundreds of TOPS.


Preliminary Data! This article deals with a microprocessor or chip that was recently announced or leaked, so details regarding its features and exact specifications are missing. Information may be incomplete and may change by final release.