Cloud AI 100 µarch
General Info
Arch Type | NPU
Designer | Qualcomm
Manufacturer | TSMC
Introduction | March 2021
Process | 7 nm
Processing Elements | 16
Pipeline
Type | VLIW
Decode | 4-way
Cache
L2 Cache | 1 MiB/core
Side Cache | 8 MiB/core
Cloud AI 100 is an NPU microarchitecture designed by Qualcomm for the server and edge markets. These NPUs are sold under the Cloud AI brand.
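For readers tallying the per-core figures in the infobox above, the short Python sketch below adds up the aggregate on-chip memory, assuming all 16 processing elements carry both the 1 MiB L2 and the 8 MiB side cache. This is an illustrative back-of-the-envelope calculation based only on the figures listed here, not an official Qualcomm total.

```python
# Aggregate on-chip memory implied by the infobox figures:
# 16 processing elements, each with 1 MiB of L2 and 8 MiB of side cache.
# (Assumes every PE carries both memories; totals are illustrative only.)

PE_COUNT = 16
L2_PER_PE_MIB = 1
SIDE_CACHE_PER_PE_MIB = 8

total_l2_mib = PE_COUNT * L2_PER_PE_MIB                  # 16 MiB
total_side_cache_mib = PE_COUNT * SIDE_CACHE_PER_PE_MIB  # 128 MiB

print(f"Total L2:         {total_l2_mib} MiB")
print(f"Total side cache: {total_side_cache_mib} MiB")
print(f"Combined on-chip: {total_l2_mib + total_side_cache_mib} MiB")
```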