NNP-I 1100

General Info
Designer | Intel
Manufacturer | Intel
Model Number | NNP-I 1100
Market | Server, Edge
Introduction | November 12, 2019 (announced), November 12, 2019 (launched)

General Specs
Family | NNP
Series | NNP-I

Microarchitecture
Microarchitecture | Spring Hill
Process | 10 nm
Transistors | 8,500,000,000
Technology | CMOS
Die Area | 239 mm²
Cores | 12

Electrical
TDP | 12 W

Packaging
Form Factor | M.2 accelerator card

NNP-I 1100 is an inference neural processor designed by Intel Nervana and introduced in late 2019. Fabricated on Intel's 10 nm process and based on the Spring Hill microarchitecture, the NNP-I 1100 integrates 12 inference compute engines (ICEs) for a peak performance of 50 TOPS at a TDP of 12 W. The chip comes in an M.2 accelerator card form factor.
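
The headline numbers above imply the figures of merit usually quoted for inference accelerators. The short Python sketch below is an illustrative back-of-the-envelope calculation using only the specifications listed on this page; the variable names and derived figures are ours, not Intel's.

# Back-of-the-envelope metrics derived from the listed NNP-I 1100 specs.
# Illustrative arithmetic only, not official Intel figures.
peak_tops = 50                      # peak INT8 performance, TOPS
tdp_w = 12                          # thermal design power, W
transistors = 8_500_000_000         # transistor count
die_area_mm2 = 239                  # die area, mm^2

tops_per_watt = peak_tops / tdp_w                 # ~4.2 TOPS/W
mtr_per_mm2 = transistors / 1e6 / die_area_mm2    # ~35.6 MTr/mm^2

print(f"Power efficiency:   {tops_per_watt:.1f} TOPS/W")
print(f"Transistor density: {mtr_per_mm2:.1f} MTr/mm^2")

At roughly 4.2 TOPS/W, the 12 W power envelope is low enough to be compatible with the M.2 accelerator-card form factor noted above.
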
Facts about "NNP-I 1100 - Intel Nervana"
core count | 12
designer | Intel
die area | 239 mm²
family | NNP
first announced | November 12, 2019
first launched | November 12, 2019
full page name | nervana/nnp/nnp-i 1100
has ecc memory support | true
instance of | microprocessor
ldate | November 12, 2019
manufacturer | Intel
market segment | Server and Edge
max memory bandwidth | 67.2 GB/s
microarchitecture | Spring Hill and Sunny Cove
model number | NNP-I 1100
name | NNP-I 1100
peak integer ops (8-bit) | 50 TOPS
process | 10 nm
series | NNP-I
supported memory type | LPDDR4X-4200
tdp | 12 W
technology | CMOS
transistor count | 8,500,000,000
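
The bandwidth and memory-type facts above are consistent with one another. Assuming a 128-bit LPDDR4X interface, an assumption made here only to reconcile the two listed values and not stated on this page, the peak bandwidth follows directly from the transfer rate, as the sketch below shows.

# Peak LPDDR4X bandwidth check (illustrative). The 128-bit bus width is an
# assumption used to reconcile LPDDR4X-4200 with the quoted 67.2 GB/s; it is
# not stated on this page.
transfer_rate_mtps = 4200                 # LPDDR4X-4200 data rate, MT/s
bus_width_bits = 128                      # assumed interface width
peak_gb_s = transfer_rate_mtps * (bus_width_bits / 8) / 1000
print(f"Peak memory bandwidth: {peak_gb_s:.1f} GB/s")   # 67.2 GB/s
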