Difference between revisions of "bitmain/sophon/bm1680"
Line 11:
 |process=28 nm
 |technology=CMOS
+|v core=0.9 V
+|v core tolerance=5%
+|v io=1.8 V
+|v io tolerance=5%
 |tdp=41 W
+|tdp typical=25 W
 |temp min=0 °C
 |temp max=125 °C
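For reference, a sketch of how the affected portion of the mpu infobox parameter block reads after this revision, using only the parameters visible in the diff above (surrounding fields of the template are omitted):

|process=28 nm
|technology=CMOS
|v core=0.9 V
|v core tolerance=5%
|v io=1.8 V
|v io tolerance=5%
|tdp=41 W
|tdp typical=25 W
|temp min=0 °C
|temp max=125 °C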
Revision as of 00:26, 7 November 2017
The Sophon BM1680 is a neural processor designed by Bitmain, capable of performing both neural network inference and neural network training.