From WikiChip
Revision as of 18:00, 21 September 2019
A neural processor or a neural processing unit (NPU) is a microprocessor that specializes in the acceleration of machine learning algorithms, typically by operating on predictive models such as artificial neural networks (ANNs) or random forests (RFs).
NPUs sometimes go by similar names such as a tensor processing unit (TPU), neural network processor (NNP), and intelligence processing unit (IPU), as well as vision processing unit (VPU) and graph processing unit (GPU).
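The workload these chips accelerate is dominated by the dense multiply-accumulate (MAC) operations inside neural-network layers, usually at low precision for inference. As a rough, hypothetical sketch (not the design of any particular NPU listed below), the following shows the kind of int8-input, 32-bit-accumulator arithmetic that NPU MAC arrays perform in hardware:

```python
# Hypothetical sketch of the core NPU workload: low-precision
# multiply-accumulate, as used in a fully connected ANN layer.
# Inputs and weights are int8-range values; the accumulator is
# wider (int32 in typical hardware) to avoid overflow.

def int8_dot(a, b):
    """Dot product of two int8-range vectors with a wide accumulator."""
    assert len(a) == len(b)
    acc = 0  # stands in for the int32 accumulator register
    for x, w in zip(a, b):
        assert -128 <= x <= 127 and -128 <= w <= 127, "int8 range"
        acc += x * w
    return acc

def dense_layer(inputs, weights, biases):
    """One fully connected layer: out[j] = sum_i inputs[i]*weights[j][i] + biases[j]."""
    return [int8_dot(inputs, row) + b for row, b in zip(weights, biases)]

# Example: a 3-input, 2-output layer.
out = dense_layer([1, 2, 3], [[10, 0, -5], [1, 1, 1]], [100, 0])
# out[0] = 1*10 + 2*0 + 3*(-5) + 100 = 95; out[1] = 1 + 2 + 3 + 0 = 6
```

An NPU differs from a general-purpose CPU here mainly in scale: it lays out thousands of such MAC units in parallel arrays rather than executing the loop sequentially.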
Overview
This section is empty; you can help add the missing info by editing this page.
List of machine learning processors
- Alibaba: Ali-NPU
- Amazon: AWS Inferentia
- Apple: Neural Engine
- ARM: ML Processor
- Baidu: Kunlun
- Bitmain: Sophon
- Cambricon: MLU
- Flex Logix: InferX
- Nepes: NM500 (General Vision tech)
- GreenWaves: GAP8
- Google: TPU
- Graphcore: IPU
- Groq:
- Hailo: Hailo-8
- Huawei: Ascend
- Intel: NNP, Myriad, EyeQ, GNA
- Kendryte: K210
- NationalChip: Neural Processing Unit (NPU)
- Nvidia: NVDLA, Xavier
- Samsung: Neural Processing Unit (NPU)
- Synaptics: SyNAP (NPU)
- Tesla: FSD Chip
- Wave Computing: DPU
This list is incomplete; you can help by expanding it.