Neural processor
A neural processor or a neural processing unit (NPU) is a microprocessor that specializes in the acceleration of machine learning algorithms, typically by operating on predictive models such as artificial neural networks (ANNs) or random forests (RFs).
NPUs sometimes go by similar names such as tensor processing unit (TPU), neural network processor (NNP), and intelligence processing unit (IPU), as well as vision processing unit (VPU) and graph processing unit (GPU).
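To make the workload concrete, the sketch below shows the kind of operation these accelerators are built around: dense multiply-accumulate for one fully connected neural-network layer, typically carried out in low-precision integer arithmetic with wide accumulators. This is an illustrative example in plain NumPy, not any vendor's API; the function name, tensor shapes, and requantization scale are assumptions made for the demonstration.

```python
# Illustrative sketch (not a vendor API): a quantized dense layer of the
# sort an NPU's MAC array accelerates -- int8 inputs and weights,
# int32 accumulation, then requantization back to int8.
import numpy as np

def int8_dense_layer(x, w, bias, scale):
    """One fully connected layer: y = ReLU(requantize(x @ w + bias))."""
    acc = x.astype(np.int32) @ w.astype(np.int32)      # wide accumulators
    acc += bias                                         # int32 bias add
    y = np.clip(np.round(acc * scale), -128, 127)       # requantize to int8 range
    return np.maximum(y, 0).astype(np.int8)             # ReLU activation

# Usage example: a batch of 4 activations through a 256 -> 64 layer
# (all sizes are arbitrary, chosen only for illustration).
rng = np.random.default_rng(0)
x = rng.integers(-128, 127, size=(4, 256), dtype=np.int8)
w = rng.integers(-128, 127, size=(256, 64), dtype=np.int8)
b = rng.integers(-1024, 1024, size=64, dtype=np.int32)
out = int8_dense_layer(x, w, b, scale=1 / 512)
print(out.shape)  # (4, 64)
```

The interesting property for hardware is that the inner loop is nothing but multiply-accumulate over low-precision operands, which is why NPUs typically expose large arrays of such units rather than general-purpose cores.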
List of machine learning processors
| Designer  | NPU                          |
|-----------|------------------------------|
| Amazon    | AWS Inferentia               |
| Alibaba   | Ali-NPU                      |
| Baidu     | Kunlun                       |
| Bitmain   | Sophon                       |
| Cambricon | MLU                          |
| Google    | TPU                          |
| Graphcore | IPU                          |
| Groq      |                              |
| Intel     | NNP, Myriad, EyeQ            |
| Nvidia    | NVDLA                        |
| Huawei    | Ascend                       |
| Apple     | Neural Engine                |
| Samsung   | Neural Processing Unit (NPU) |
This list is incomplete; you can help by expanding it.