Neural processor
A neural processor or a neural processing unit (NPU) is a microprocessor that specializes in the acceleration of machine learning algorithms, typically by operating on predictive models such as artificial neural networks (ANNs) or random forests (RFs).
NPUs sometimes go by similar names, such as tensor processing unit (TPU), neural network processor (NNP), intelligence processing unit (IPU), vision processing unit (VPU), and graph processing unit (GPU).
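The arithmetic these accelerators target is dominated by multiply-accumulate (MAC) operations over model weights and activations. Below is a minimal sketch in Python (assuming NumPy is available; the layer sizes are hypothetical and chosen only for illustration) of a single dense neural-network layer, the kind of computation an NPU typically executes in wide, often low-precision MAC arrays rather than on general-purpose cores.

# Minimal sketch of the MAC-heavy workload an NPU accelerates:
# one fully connected (dense) layer of an artificial neural network.
import numpy as np

def dense_layer(x, weights, bias):
    """Forward pass of a dense layer: y = relu(x @ W + b).

    The matrix multiply is a long sequence of multiply-accumulate
    operations, which is what NPU hardware parallelizes.
    """
    return np.maximum(x @ weights + bias, 0.0)  # ReLU activation

# Hypothetical sizes, for illustration only.
x = np.random.rand(1, 256).astype(np.float32)    # input activations
w = np.random.rand(256, 128).astype(np.float32)  # layer weights
b = np.zeros(128, dtype=np.float32)              # biases
y = dense_layer(x, w, b)
print(y.shape)  # (1, 128)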
Overview
This section is empty; you can help add the missing info by editing this page.
List of machine learning processors
- Alibaba: Ali-NPU
- Amazon: AWS Inferentia
- Apple: Neural Engine
- ARM: ML Processor
- Baidu: Kunlun
- Bitmain: Sophon
- Cambricon: MLU
- Flex Logix: InferX
- Nepes: NM500 (General Vision tech)
- GreenWaves: GAP8
- Google: TPU
- Graphcore: IPU
- Groq
- Hailo: Hailo-8
- Huawei: Ascend
- Intel: NNP, Myriad, EyeQ, GNA
- Kendryte: K210
- NationalChip: Neural Processing Unit (NPU)
- Nvidia: NVDLA, Xavier
- Samsung: Neural Processing Unit (NPU)
- Synaptics: SyNAP (NPU)
- Wave Computing: DPU
This list is incomplete; you can help by expanding it.