From WikiChip
El Capitan (ATS-4) - Supercomputers
| El Capitan | |
| --- | --- |
| **General Info** | |
| Sponsors | U.S. Department of Energy |
| Designers | AMD, Cray |
| Operators | Lawrence Livermore National Laboratory |
| Introduction | 2024 |
| Peak FLOPS | 2,746 PFLOPS |
El Capitan (ATS-4) is the successor to Sierra, an exascale supercomputer at the DOE's Lawrence Livermore National Laboratory (NNSA/LLNL). Originally planned for the 2022-2024 timeframe, the system was deployed in 2024. El Capitan is expected to be succeeded by ATS-6 in the 2027-2028 timeframe.
Overview
- El Capitan (2024) - HPE Cray EX255a, ranked #1 on the June 2025 TOP500 list
  - AMD 4th Gen EPYC 24C @ 1.8 GHz, AMD Instinct MI300A, Slingshot-11, TOSS
  - Cores: 11,039,616
  - Rmax: 1,742,000 TFlop/s
  - Rpeak: 2,746,376 TFlop/s
  - Power: 29,581 kW
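Two derived figures follow directly from the overview numbers: the HPL efficiency (the ratio of achieved Rmax to theoretical Rpeak) and the energy efficiency of the Linpack run. A minimal sketch, using only the Rmax, Rpeak, and power values listed above:

```python
# Back-of-the-envelope check on El Capitan's TOP500 figures (from the
# overview above). All inputs are taken directly from the listed values.
rmax_tflops = 1_742_000    # Rmax, TFlop/s (~1.742 EFLOPS)
rpeak_tflops = 2_746_376   # Rpeak, TFlop/s
power_kw = 29_581          # power during the HPL run, kW

# HPL efficiency: fraction of theoretical peak achieved on Linpack.
hpl_efficiency = rmax_tflops / rpeak_tflops

# Energy efficiency: GFlop/s delivered per watt.
gflops_per_watt = (rmax_tflops * 1e3) / (power_kw * 1e3)

print(f"HPL efficiency:    {hpl_efficiency:.1%}")       # ~63.4%
print(f"Energy efficiency: {gflops_per_watt:.1f} GFLOPS/W")  # ~58.9
```

The roughly 63% Rmax/Rpeak ratio is typical of large GPU-accelerated HPL runs.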
Models
- National Nuclear Security Administration (NNSA)
- Lawrence Livermore National Laboratory (LLNL)
- Los Alamos National Laboratory (LANL)
TOP500
- Lawrence Livermore National Laboratory (NNSA/LLNL)
**DOE/NNSA/LLNL** (Segment: Research; City: Livermore; Country/Region: United States; URL: http://www.llnl.gov/)

| System | Year | Vendor | Cores | Rmax (GFlop/s) | Rpeak (GFlop/s) |
| --- | --- | --- | --- | --- | --- |
| El Capitan - HPE Cray EX255a, AMD 4th Gen EPYC 24C 1.8 GHz, AMD Instinct MI300A, Slingshot-11, TOSS | 2024 | HPE | 11,039,616 | 1,742,000,000 | 2,746,376,090 |
| El Capitan Early Delivery - HPE Cray EX255a, AMD 4th Gen EPYC 24C 1.8 GHz, AMD Instinct MI300A, Slingshot-11 | 2024 | HPE | 129,024 | 19,650,000 | 32,097,894 |
| Tuolumne - HPE Cray EX255a, AMD 4th Gen EPYC 24C 1.8 GHz, AMD Instinct MI300A, Slingshot-11, TOSS | 2024 | HPE | 1,161,216 | 208,100,000 | 288,881,050 |
| rzAdams - HPE Cray EX255a, AMD 4th Gen EPYC 24C 1.8 GHz, AMD Instinct MI300A, Slingshot-11, TOSS | 2024 | HPE | 129,024 | 24,380,000 | 32,097,890 |
| rzVernal - HPE Cray EX235a, AMD Optimized 3rd Generation EPYC 64C 2.0 GHz, AMD Instinct MI250X, Slingshot-11 | 2022 | HPE | 35,872 | 5,401,000 | 6,917,820 |
| Tioga - HPE Cray EX235a, AMD Optimized 3rd Generation EPYC 64C 2.0 GHz, AMD Instinct MI250X, Slingshot-11 | 2022 | HPE | 30,208 | 4,548,210 | 5,825,540 |
| Tenaya - HPE Cray EX235a, AMD Optimized 3rd Generation EPYC 64C 2.0 GHz, AMD Instinct MI250X, Slingshot-11 | 2022 | HPE | 22,656 | 3,411,160 | 4,369,150 |
| Sierra - IBM Power System AC922, IBM POWER9 22C 3.1 GHz, NVIDIA Volta GV100, Dual-rail Mellanox EDR Infiniband | 2018 | IBM / NVIDIA / Mellanox | 1,572,480 | 94,640,000 | 125,712,000 |
| Lassen - IBM Power System AC922, IBM POWER9 22C 3.1 GHz, Dual-rail Mellanox EDR Infiniband, NVIDIA Tesla V100 | 2018 | IBM / NVIDIA / Mellanox | 288,288 | 18,200,000 | 23,047,200 |
| Ansel - IBM Power System AC922, IBM POWER9 22C 3.1 GHz, Dual-rail Mellanox EDR Infiniband, NVIDIA Tesla V100 | 2018 | IBM / NVIDIA / Mellanox | 19,656 | 1,289,000 | 1,686,571 |
| Vulcan - BlueGene/Q, Power BQC 16C 1.6 GHz, Custom Interconnect | 2012 | IBM | 393,216 | 4,293,306 | 5,033,165 |
| Sequoia - BlueGene/Q, Power BQC 16C 1.6 GHz, Custom | 2011 | IBM | 1,572,864 | 17,173,224 | 20,132,659 |

**DOE/NNSA/LANL/SNL** (Segment: Research; City: Los Alamos; Country/Region: United States; URL: http://www.lanl.gov/)

| System | Year | Vendor | Cores | Rmax (GFlop/s) | Rpeak (GFlop/s) |
| --- | --- | --- | --- | --- | --- |
| Crossroads - HPE Cray EX, Intel Xeon CPU Max 9480 56C 1.9 GHz, Slingshot-11 | 2023 | HPE | 660,800 | 30,034,700 | 40,176,640 |
| Trinity - Cray XC40, Intel Xeon E5-2698v3 16C 2.3 GHz, Intel Xeon Phi 7250 68C 1.4 GHz, Aries interconnect | 2017 | Cray/HPE | 979,072 | 20,158,700 | 41,461,150 |
| Cielo - Cray XE6, Opteron 6136 8C 2.4 GHz, Custom | 2011 | Cray/HPE | 142,272 | 1,110,000 | 1,365,811.2 |
| Cielo - Cray XE6 8-core 2.4 GHz | 2010 | Cray/HPE | 107,152 | 816,600 | 1,028,659.2 |

**DOE/NNSA/LANL** (Segment: Research; City: Los Alamos; Country/Region: United States; URL: http://www.lanl.gov/)

| System | Year | Vendor | Cores | Rmax (GFlop/s) | Rpeak (GFlop/s) |
| --- | --- | --- | --- | --- | --- |
| Venado - HPE Cray EX254n, NVIDIA Grace 72C 3.1 GHz, NVIDIA GH200 Superchip, Slingshot-11 | 2024 | HPE | 481,440 | 98,510,000 | 130,444,750 |
| Roadrunner - BladeCenter QS22/LS21 Cluster, PowerXCell 8i 3.2 GHz / Opteron DC 1.8 GHz, Voltaire Infiniband | 2009 | IBM | 122,400 | 1,042,000 | 1,375,776 |
| Roadrunner - BladeCenter QS22/LS21 Cluster, PowerXCell 8i 3.2 GHz / Opteron DC 1.8 GHz, Voltaire Infiniband | 2008 | IBM | 129,600 | 1,105,000 | 1,456,704 |
| Cerrillos - BladeCenter QS22/LS21 Cluster, PowerXCell 8i 3.2 GHz / Opteron DC 1.8 GHz, Infiniband | 2009 | IBM | 14,400 | 126,500 | 161,856 |

**Lawrence Livermore National Laboratory** (Segment: Research; City: Livermore; Country/Region: United States; URL: http://www.llnl.gov/)

| System | Year | Vendor | Cores | Rmax (GFlop/s) | Rpeak (GFlop/s) |
| --- | --- | --- | --- | --- | --- |
| Bengal (LLNL CTS-2) - Dell PowerEdge C6620, Intel Xeon Platinum 8480+ 56C 2.0 GHz, Cornelis Networks Omni-Path | 2023 | DELL (Linux/TOSS) | 107,520 | 7,196,630 | 8,085,500 |
| Dane (LLNL CTS-2) - Dell PowerEdge C6620, Intel Xeon Platinum 8480+ 56C 2.0 GHz, Cornelis Networks Omni-Path | 2023 | DELL | 123,648 | 7,041,000 | 7,913,000 |
| Ruby (LLNL) - Supermicro SYS-2029TP-HTR, Intel Xeon Platinum 8276L 28C 2.2 GHz, Cornelis Networks Omni-Path | 2020 | Supermicro | 85,568 | 3,700,150 | 6,023,990 |
| Magma (LLNL/NNSA CTS-1) - Relion Cluster, Intel Xeon Platinum 9242 48C 2.3 GHz, Intel Omni-Path | 2019 | Penguin Computing | 62,400 | 3,241,240 | 4,592,640 |
| Quartz (LLNL CTS-1) - Tundra Extreme Scale, Intel Xeon E5-2695v4 18C 2.1 GHz, Intel Omni-Path | 2016 | Penguin Computing | 95,472 | 2,632,510 | 3,207,859.2 |
| Jade (LLNL/NNSA CTS-1) - Tundra Extreme Scale, Intel Xeon E5-2695v4 18C 2.1 GHz, Intel Omni-Path | 2016 | Penguin Computing | 95,472 | 2,632,510 | 3,207,859.2 |
| Nel (LLNL/NNSA CTS-1) - Tundra Extreme Scale, Intel Xeon E5-2695v4 18C 2.1 GHz, Intel Omni-Path | 2017 | Penguin Computing | 39,744 | 1,179,580 | 1,335,398 |
| Zin - Xtreme-X GreenBlade GB512X, Intel Xeon E5 (Sandy Bridge - EP) 8C 2.6 GHz, Infiniband QDR | 2011 | Cray/HPE | 46,208 | 773,700 | 961,126.4 |
| Cab - Xtreme-X, Intel Xeon E5-2670 8C 2.6 GHz, Infiniband QDR | 2012 | Cray/HPE | 20,480 | 347,400 | 425,984 |
| Sierra - Dell Xanadu 3 Cluster, Intel Xeon X5660 2.8 GHz, QLogic InfiniBand QDR | 2010 | DELL | 21,756 | 166,700 | 243,667 |
| Juno - Appro XtremeServer 1143H, Opteron QC 2.2 GHz, Infiniband | 2008 | Cray/HPE | 18,224 | 131,600 | 162,200 |
| Muir - Dell Xanadu 3 Cluster, Intel Xeon X5660 2.8 GHz, QLogic InfiniBand QDR | 2010 | DELL | 15,000 | 105,900 | 168,000 |
| Hera - Appro Xtreme-X3 Server, Quad Opteron Quad Core 2.3 GHz, Infiniband | 2009 | Cray/HPE | 13,552 | 102,200 | 127,200 |
| Edge - Appro GreenBlade Cluster, Intel Xeon X5660 6C 2.8 GHz, Infiniband QDR, NVIDIA 2050 | 2010 | Cray/HPE | 8,240 | 100,500 | 239,866 |
| Graph - Appro Xtreme-X3, Opteron 2.0 GHz, Infiniband DDR | 2009 | Cray/HPE | 13,440 | 83,080 | 107,500 |
| Coastal - Dell DCS Xanadu 2.5, Intel Xeon E55xx 2.4 GHz, Infiniband DDR | 2010 | DELL | 8,464 | 72,410 | 81,254 |
| Atlas - Appro Xtreme Server, Quad Opteron Dual Core 2.4 GHz, Infiniband | 2007 | Cray/HPE | 9,216 | 36,620 | 44,240 |
| Hyperion - Dell DCS Xanadu 2.5, Intel Xeon E55xx 2.4 GHz, Infiniband DDR | 2010 | DELL | 4,032 | 31,860 | 40,320 |
| Titan - Dell DCS Xanadu 2.5, Intel Xeon E55xx 2.4 GHz, Infiniband DDR | 2010 | DELL | 4,032 | 31,860 | 40,320 |
| Ansel - Dell Xanadu 3 Cluster, Intel Xeon X5660 2.8 GHz, QLogic InfiniBand QDR | 2010 | DELL | 3,480 | 31,790 | 38,976 |
| Minos - Appro Xtreme Server, Quad Opteron Dual Core 2.4 GHz, Infiniband | 2007 | Cray/HPE | 6,912 | 27,380 | 33,178 |
| Thunder - Intel Itanium 2 Tiger4 1.4 GHz, Quadrics | 2004 | California Digital Corporation | 4,096 | 19,940 | 22,938 |
| Rhea - Appro Xtreme Server, Quad Opteron Dual Core 2.4 GHz, Infiniband | 2007 | Cray/HPE | 4,608 | 18,240 | 22,118 |
| Zeus - Appro Xtreme Server, Quad Opteron Dual Core 2.4 GHz, Infiniband | 2006 | Cray/HPE | 2,304 | 8,181 | 11,059.2 |
| Lilac - xSeries x335 Cluster, Intel Xeon 3.0 GHz, Quadrics | 2004 | IBM | 1,540 | 6,232 | 9,425 |
| eServer Blue Gene Solution | 2006 | IBM | 2,048 | 4,713 | 5,734 |
| Gauss - GraphStream Gen3, Opteron 2.4 GHz, Infiniband | 2005 | GraphStream | 514 | 1,807 | 2,467 |
| Adelie - Intel Pentium 4 Xeon 2.8 GHz Cluster, Quadrics | 2003 | Promicro/Quadrics | 256 | 1,036 | 1,433.6 |
| Emperor - Intel Pentium 4 Xeon 2.8 GHz Cluster, Quadrics | 2003 | Promicro/Quadrics | 256 | 1,036 | 1,433.6 |
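The TOP500 history above also shows how Rmax has scaled across successive LLNL flagship systems. A short sketch, using only the Rmax values (GFlop/s) listed for Sequoia, Sierra, and El Capitan:

```python
# Rmax (GFlop/s) of successive LLNL flagship systems, taken from the
# TOP500 table above.
flagships = {
    "Sequoia (2011)": 17_173_224,
    "Sierra (2018)": 94_640_000,
    "El Capitan (2024)": 1_742_000_000,
}

# Growth factor between each consecutive pair of flagships.
names = list(flagships)
for prev, curr in zip(names, names[1:]):
    factor = flagships[curr] / flagships[prev]
    print(f"{prev} -> {curr}: {factor:.1f}x Rmax")
```

This works out to roughly 5.5x from Sequoia to Sierra and roughly 18.4x from Sierra to El Capitan, about a hundredfold increase in sustained Linpack performance over thirteen years.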