Nancy:Hardware

From Grid5000
Revision as of 11:13, 22 March 2019 by Sphilippot (talk | contribs)


Summary

9 clusters, 217 nodes, 4560 cores, 65.2 TFLOPS
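The 65.2 TFLOPS figure is the theoretical peak aggregated by the Reference API. Per-cluster contributions can be approximated with the usual peak formula; the FLOPs/cycle values below are assumptions per microarchitecture (not taken from this page), so the result is a rough sketch rather than the API's exact method:

```python
# Theoretical peak in TFLOPS for a homogeneous cluster:
#   nodes x CPUs/node x cores/CPU x clock (GHz) x FLOPs/cycle / 1000
# FLOPs/cycle (double precision) is an assumption per microarchitecture,
# e.g. Sandy Bridge (AVX): 8, Haswell/Broadwell (AVX2 + FMA): 16.
def peak_tflops(nodes, cpus_per_node, cores_per_cpu, ghz, flops_per_cycle):
    return nodes * cpus_per_node * cores_per_cpu * ghz * flops_per_cycle / 1000

# Example: graoully (16 nodes, 2 x E5-2630 v3 @ 2.40 GHz, 8 cores/CPU, Haswell)
print(round(peak_tflops(16, 2, 8, 2.40, 16), 2))  # 9.83
```

Note that the site's published total may use different clock or vector-width assumptions (e.g. base vs. AVX frequencies), so per-cluster estimates will not necessarily sum to exactly 65.2 TFLOPS.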

Cluster   | Queue      | Date of arrival | Nodes | CPU                       | Cores       | Memory  | Storage                     | Network                          | Accelerators
graoully  | production | 2016-01-04      | 16    | 2 x Intel Xeon E5-2630 v3 | 8 cores/CPU  | 128 GiB | 2 x 600 GB HDD              | 10 Gbps + 56 Gbps InfiniBand     |
graphique | production | 2015-05-12      | 6     | 2 x Intel Xeon E5-2620 v3 | 6 cores/CPU  | 64 GiB  | 299 GB HDD                  | 10 Gbps + 56 Gbps InfiniBand     | 1: 2 x Nvidia Titan Black; [2-6]: 2 x Nvidia GTX 980
graphite  | default    | 2013-12-05      | 4     | 2 x Intel Xeon E5-2650    | 8 cores/CPU  | 256 GiB | 2 x 300 GB SSD              | 10 Gbps + 56 Gbps InfiniBand     | Intel Xeon Phi 7120P
grcinq    | production | 2013-04-09      | 48    | 2 x Intel Xeon E5-2650    | 8 cores/CPU  | 64 GiB  | 1.0 TB HDD                  | 1 Gbps + 56 Gbps InfiniBand      |
grele     | production | 2017-06-26      | 14    | 2 x Intel Xeon E5-2650 v4 | 12 cores/CPU | 128 GiB | 2 x 299 GB HDD              | 10 Gbps + 100 Gbps Omni-Path     | 2 x Nvidia GTX 1080 Ti
grimani   | production | 2016-08-30      | 6     | 2 x Intel Xeon E5-2603 v3 | 6 cores/CPU  | 64 GiB  | 1.0 TB HDD                  | 10 Gbps + 100 Gbps Omni-Path     | 2 x Nvidia Tesla K40M
grimoire  | default    | 2016-01-22      | 8     | 2 x Intel Xeon E5-2630 v3 | 8 cores/CPU  | 128 GiB | 200 GB SSD + 5 x 600 GB HDD | 4 x 10 Gbps + 56 Gbps InfiniBand |
grisou    | default    | 2016-01-04      | 51    | 2 x Intel Xeon E5-2630 v3 | 8 cores/CPU  | 128 GiB | 2 x 600 GB HDD              | [1-48]: 1 Gbps + 4 x 10 Gbps; 49: 4 x 10 Gbps; [50-51]: 4 x 10 Gbps + 56 Gbps InfiniBand |
grvingt   | production | 2018-04-11      | 64    | 2 x Intel Xeon Gold 6130  | 16 cores/CPU | 192 GiB | 1.0 TB HDD                  | 10 Gbps + 100 Gbps Omni-Path     |
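Each "(json)" mention below points at the Grid'5000 Reference API, which describes every node as a JSON document. The per-cluster headline counts can be reproduced by summarizing those documents; a minimal sketch, where the `architecture.nb_procs`/`nb_cores` field names are assumptions modeled on that API rather than guaranteed by this page:

```python
# Summarize a list of node descriptions as returned by the Reference API.
# Field names ("architecture", "nb_procs", "nb_cores") are assumptions.
def summarize(nodes):
    cpus = sum(n["architecture"]["nb_procs"] for n in nodes)
    cores = sum(n["architecture"]["nb_cores"] for n in nodes)
    return len(nodes), cpus, cores

# A minimal stand-in for the 16 graoully nodes (the real JSON has many
# more fields); nb_cores here is the per-node total (2 CPUs x 8 cores).
sample = [{"architecture": {"nb_procs": 2, "nb_cores": 16}}] * 16
print(summarize(sample))  # (16, 32, 256)
```

In practice the node list would be fetched from the API (e.g. the nodes collection of a cluster under a site) rather than built by hand.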

Cluster details

graoully (production queue)

16 nodes, 32 cpus, 256 cores (json)

Model: Dell PowerEdge R630
Date of arrival: 2016-01-04
CPU: Intel Xeon E5-2630 v3 (Haswell, 2.40GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 128 GiB
Storage:
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:1:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe - unavailable for experiment
  • eth2/enp129s0f0, Ethernet, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe - unavailable for experiment
  • eth3/enp129s0f1, Ethernet, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe - unavailable for experiment
  • eth4/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth5/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core

graphique (production queue)

6 nodes, 12 cpus, 72 cores, split as follows due to differences between nodes (json)

graphique-1 (1 node, 2 cpus, 12 cores)
Model: Dell PowerEdge R720
Date of arrival: 2015-05-12
CPU: Intel Xeon E5-2620 v3 (Haswell, 2.40GHz, 2 CPUs/node, 6 cores/CPU)
Memory: 64 GiB
Storage: 299 GB HDD SCSI PERC H330 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:2:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x
  • eth1/eno2, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • eth2/eno3, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • eth3/eno4, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core
GPU: 2 x Nvidia Titan Black

graphique-[2-5] (4 nodes, 8 cpus, 48 cores)
Model: Dell PowerEdge R720
Date of arrival: 2015-05-12
CPU: Intel Xeon E5-2620 v3 (Haswell, 2.40GHz, 2 CPUs/node, 6 cores/CPU)
Memory: 64 GiB
Storage: 299 GB HDD SCSI PERC H330 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:2:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x
  • eth1/eno2, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • eth2/eno3, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • eth3/eno4, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core
GPU: 2 x Nvidia GTX 980

graphique-6 (1 node, 2 cpus, 12 cores)
Model: Dell PowerEdge R720
Date of arrival: 2015-05-12
CPU: Intel Xeon E5-2620 v3 (Haswell, 2.40GHz, 2 CPUs/node, 6 cores/CPU)
Memory: 64 GiB
Storage: 299 GB HDD SCSI PERC H330 Mini (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:2:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x
  • eth1/eno2, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • eth2/eno3, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • eth3/eno4, Ethernet, model: Broadcom NetXtreme II BCM57800 1/10 Gigabit Ethernet, driver: bnx2x - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core
GPU: 2 x Nvidia GTX 980

graphite

4 nodes, 8 cpus, 64 cores (json)

Model: Dell PowerEdge R720
Date of arrival: 2013-12-05
CPU: Intel Xeon E5-2650 (Sandy Bridge, 2.00GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 256 GiB
Storage:
  • 300 GB SSD SATA II INTEL SSDSC2BB30 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
  • 300 GB SSD SATA II INTEL SSDSC2BB30 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:1:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core
Xeon Phi: Intel Xeon Phi 7120P

grcinq (production queue)

48 nodes, 96 cpus, 768 cores, split as follows due to differences between nodes (json)

grcinq-[1,5,18,30,46,48] (6 nodes, 12 cpus, 96 cores)
Model: Dell PowerEdge C6220
Date of arrival: 2013-04-09
CPU: Intel Xeon E5-2650 (Sandy Bridge, 2.00GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 64 GiB
Storage: 1.0 TB HDD SATA ST1000NM0033-9ZM (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel I350 Gigabit Network Connection, driver: igb
  • eth1/eno2, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core

grcinq-[2-4,6-17,19-29,31-45,47] (42 nodes, 84 cpus, 672 cores)
Model: Dell PowerEdge C6220
Date of arrival: 2013-04-09
CPU: Intel Xeon E5-2650 (Sandy Bridge, 2.00GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 64 GiB
Storage: 1.0 TB HDD SATA WDC WD1003FBYX-1 (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel I350 Gigabit Network Connection, driver: igb
  • eth1/eno2, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core

grele (production queue)

14 nodes, 28 cpus, 336 cores (json)

Model: Dell PowerEdge R730
Date of arrival: 2017-06-26
CPU: Intel Xeon E5-2650 v4 (Broadwell, 2.20GHz, 2 CPUs/node, 12 cores/CPU)
Memory: 128 GiB
Storage:
  • 299 GB HDD SAS PERC H730 Mini (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:2:0:0)
  • 299 GB HDD SAS PERC H730 Mini (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:2:1:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
GPU: 2 x Nvidia GTX 1080 Ti

grimani (production queue)

6 nodes, 12 cpus, 72 cores (json)

Model: Dell PowerEdge R730
Date of arrival: 2016-08-30
CPU: Intel Xeon E5-2603 v3 (Haswell, 1.60GHz, 2 CPUs/node, 6 cores/CPU)
Memory: 64 GiB
Storage: 1.0 TB HDD SATA ST1000NX0423 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
GPU: 2 x Nvidia Tesla K40M

grimoire

8 nodes, 16 cpus, 128 cores (json)

Model: Dell PowerEdge R630
Date of arrival: 2016-01-22
CPU: Intel Xeon E5-2630 v3 (Haswell, 2.40GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 128 GiB
Storage:
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:1:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:2:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:3:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:4:0)
  • 200 GB SSD SCSI PX02SSF020 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:5:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth2/enp129s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth3/enp129s0f1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth4/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth5/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core

grisou

51 nodes, 102 cpus, 816 cores, split as follows due to differences between nodes (json)

grisou-[1-48] (48 nodes, 96 cpus, 768 cores)
Model: Dell PowerEdge R630
Date of arrival: 2016-01-04
CPU: Intel Xeon E5-2630 v3 (Haswell, 2.40GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 128 GiB
Storage:
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:1:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth2/enp3s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth3/enp3s0f1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth4/eno3, Ethernet, configured rate: 1 Gbps, model: Intel I350 Gigabit Network Connection, driver: igb
  • eth5/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment

grisou-49 (1 node, 2 cpus, 16 cores)
Model: Dell PowerEdge R630
Date of arrival: 2016-01-04
CPU: Intel Xeon E5-2630 v3 (Haswell, 2.40GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 128 GiB
Storage:
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:1:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth2/enp3s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth3/enp3s0f1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth4/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth5/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment

grisou-[50-51] (2 nodes, 4 cpus, 32 cores)
Model: Dell PowerEdge R630
Date of arrival: 2016-01-04
CPU: Intel Xeon E5-2630 v3 (Haswell, 2.40GHz, 2 CPUs/node, 8 cores/CPU)
Memory: 128 GiB
Storage:
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:0:0)
  • 600 GB HDD SCSI ST600MM0088 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:02:00.0-scsi-0:0:1:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth1/eno2, Ethernet, configured rate: 10 Gbps, model: Intel 82599ES 10-Gigabit SFI/SFP+ Network Connection, driver: ixgbe
  • eth2/enp129s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth3/enp129s0f1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet 10G 2P X520 Adapter, driver: ixgbe
  • eth4/eno3, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • eth5/eno4, Ethernet, model: Intel I350 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 56 Gbps, model: Mellanox Technologies MT27500 Family [ConnectX-3], driver: mlx4_core
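Splits such as grisou-[1-48] / 49 / [50-51] can be recovered mechanically by grouping each node's description on the fields that differ, here the usable network interfaces. A sketch under the assumption that node JSON exposes `network_adapters` entries with `interface`, `rate`, and `mountable` fields (names modeled on the Reference API, not confirmed by this page):

```python
# Group nodes by a hardware "signature" so that homogeneous sub-groups
# (like grisou-[1-48], grisou-49, grisou-[50-51]) fall out automatically.
def hardware_groups(nodes):
    groups = {}
    for name, desc in sorted(nodes.items()):
        # Signature: the sorted (interface type, rate) pairs of usable NICs.
        sig = tuple(sorted((a["interface"], a.get("rate", 0))
                           for a in desc["network_adapters"]
                           if a.get("mountable")))
        groups.setdefault(sig, []).append(name)
    return groups

# Two toy nodes: one with only 10 Gbps Ethernet, one that also has InfiniBand.
toy = {
    "grisou-1":  {"network_adapters": [
        {"interface": "Ethernet", "rate": 10_000_000_000, "mountable": True}]},
    "grisou-50": {"network_adapters": [
        {"interface": "Ethernet", "rate": 10_000_000_000, "mountable": True},
        {"interface": "InfiniBand", "rate": 56_000_000_000, "mountable": True}]},
}
print(len(hardware_groups(toy)))  # 2
```

The same grouping applied to the full node list would also separate graphique-1 (Titan Black) from graphique-[2-6] (GTX 980) if the GPU fields were included in the signature.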

grvingt (production queue)

64 nodes, 128 cpus, 2048 cores (json)

Model: Dell PowerEdge C6420
Date of arrival: 2018-04-11
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GiB
Storage: 1.0 TB HDD SATA ST1000NX0443 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

Last generated from the Grid'5000 Reference API on 2019-03-22 (commit a2028cf)