Hanguang 800

General Info
Designer | Alibaba
Manufacturer | TSMC
Market | Server, Artificial Intelligence
Introduction | September 26, 2019 (announced); September 26, 2019 (launched)

General Specs
Family | Hanguang
Process | 12 nm
Transistors | 17,000,000,000
Technology | CMOS
Hanguang 800 is a neural processor designed by Alibaba for inference acceleration in its data centers. Introduced in September 2019, the Hanguang 800 is fabricated by TSMC on a 12 nm process. The chip is used exclusively by Alibaba in its Aliyun (Alibaba Cloud) infrastructure.