NNP-I 1300 - Intel Nervana
NNP-I 1300 is an inference neural processor designed by Intel Nervana and introduced in late 2019. Fabricated on Intel's 10 nm process and based on the Spring Hill microarchitecture, the NNP-I 1300 comes in a PCIe Gen 3.0 accelerator card form factor with two NPU chips, each with all 24 ICEs enabled, for a combined peak performance of 170 TOPS at a TDP of 75 W.
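As a rough sanity check, the card-level figures above can be broken down per chip, per ICE, and per watt. The short Python sketch below does only that arithmetic; the even split across the two chips and their 24 ICEs each is an illustrative assumption, not an official Intel specification.

# Back-of-the-envelope breakdown of the stated card figures
# (170 TOPS total, 2 chips, 24 ICEs per chip, 75 W TDP).
# Assumes throughput divides evenly across chips and ICEs (illustration only).
total_tops = 170
chips = 2
ices_per_chip = 24
tdp_w = 75

tops_per_chip = total_tops / chips            # ~85 TOPS per Spring Hill chip
tops_per_ice = tops_per_chip / ices_per_chip  # ~3.5 TOPS per ICE
tops_per_watt = total_tops / tdp_w            # ~2.3 TOPS/W at the card level

print(f"{tops_per_chip:.1f} TOPS/chip, {tops_per_ice:.2f} TOPS/ICE, "
      f"{tops_per_watt:.2f} TOPS/W")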