Difference between revisions of "bitmain/sophon/bm1680"
From WikiChip
Line 2:
  {{mpu
  |name=Sophon BM1680
−
+ |image=bitmain sophon bm1680.png
  |designer=Bitmain
  |manufacturer=TSMC

Line 8:
  |market=Artificial Intelligence
  |first announced=October 25, 2017
+ |first launched=November 8, 2017
  |family=Sophon
  |process=28 nm
Revision as of 22:33, 8 November 2017
Sophon BM1680 is a neural processor designed by Bitmain, capable of performing both neural network inference and neural network training.
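For reference, the {{mpu}} infobox wikitext of the new revision can be pieced together from the diff excerpts above. This is only a sketch: it lists just the parameters visible in the excerpts and assumes a standard closing of the template; the actual page almost certainly includes additional fields that the diff does not show.

{{mpu
|name=Sophon BM1680
|image=bitmain sophon bm1680.png
|designer=Bitmain
|manufacturer=TSMC
|market=Artificial Intelligence
|first announced=October 25, 2017
|first launched=November 8, 2017
|family=Sophon
|process=28 nm
}}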