From WikiChip
Revision as of 21:30, 29 May 2019
A neural processor or a neural processing unit (NPU) is a microprocessor that specializes in the acceleration of machine learning algorithms, typically by operating on predictive models such as artificial neural networks (ANNs) or random forests (RFs).
NPUs sometimes go by similar names, such as tensor processing unit (TPU), neural network processor (NNP), intelligence processing unit (IPU), vision processing unit (VPU), and graph processing unit (GPU).
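As a minimal sketch of the kind of computation these chips accelerate (an assumption for illustration, not a description of any listed product), consider a single fully connected ANN layer: a matrix multiply, a bias add, and an activation. NPUs typically implement this dense linear algebra in hardware, often at reduced precision.

```python
import numpy as np

def dense_layer(x, w, b):
    """One fully connected ANN layer: matrix multiply, bias add, ReLU.
    This dense linear algebra is the core workload NPUs accelerate,
    frequently in low-precision (e.g. int8/fp16) arithmetic."""
    return np.maximum(x @ w + b, 0.0)

# Toy inference: a batch of 2 inputs, 4 features in, 3 outputs.
x = np.array([[1.0, 2.0, 3.0, 4.0],
              [0.5, 0.0, -1.0, 2.0]])
w = np.full((4, 3), 0.1)   # illustrative weights
b = np.zeros(3)
y = dense_layer(x, w, b)
print(y.shape)  # (2, 3)
```

A real accelerator would fuse many such layers and quantize the operands, but the arithmetic pattern is the same.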
Overview
This section is empty; you can help add the missing info by editing this page.
List of machine learning processors
- Alibaba: Ali-NPU
- Amazon: AWS Inferentia
- Apple: Neural Engine
- ARM: ML Processor
- Baidu: Kunlun
- Bitmain: Sophon
- Cambricon: MLU
- Flex Logix: InferX
- Google: TPU
- Graphcore: IPU
- Groq:
- Hailo: Hailo-8
- Huawei: Ascend
- Intel: NNP, Myriad, EyeQ
- Kendryte: K210
- NationalChip: Neural Processing Unit (NPU)
- Nvidia: NVDLA, Xavier
- Samsung: Neural Processing Unit (NPU)
- Wave Computing: DPU
This list is incomplete; you can help by expanding it.