Grenoble:Hardware



Summary

2 clusters, 36 nodes, 1280 cores, 17.1 TFLOPS

Cluster | Queue   | Date of arrival | Nodes | CPU                       | Cores        | Memory  | Storage                                      | Network
dahu    | default | 2018-03-22      | 32    | 2 x Intel Xeon Gold 6130  | 16 cores/CPU | 192 GiB | 240 GB SSD + 480 GB SSD + 4.0 TB HDD         | 10 Gbps + 100 Gbps Omni-Path
yeti    | default | 2018-01-16      | 4     | 4 x Intel Xeon Gold 6130  | 16 cores/CPU | 768 GiB | 480 GB SSD + 2 x 1.6 TB SSD + 3 x 2.0 TB HDD | 10 Gbps + 100 Gbps Omni-Path
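
The cluster and node details on this page are generated from the Grid'5000 Reference API (see the note at the bottom of the page). As a minimal sketch, assuming the public Reference API endpoints and a Grid'5000 account used for HTTP basic authentication (the collection layout with an "items" key is also an assumption), the same node descriptions can be retrieved in JSON with a few lines of Python:

    # Minimal sketch: fetch the node descriptions this page is generated from.
    # Assumptions: the Reference API is reachable at api.grid5000.fr, credentials
    # are passed via HTTP basic auth, and collections expose an "items" list.
    import requests

    API = "https://api.grid5000.fr/stable"
    AUTH = ("g5k_login", "g5k_password")  # placeholder credentials (assumption)

    for cluster in ("dahu", "yeti"):
        url = f"{API}/sites/grenoble/clusters/{cluster}/nodes"
        resp = requests.get(url, auth=AUTH)
        resp.raise_for_status()
        nodes = resp.json().get("items", [])
        print(f"{cluster}: {len(nodes)} nodes")
        for node in nodes:
            arch = node.get("architecture", {})
            print(" ", node.get("uid"), "-", arch.get("nb_cores"), "cores")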

Cluster details

dahu

32 nodes, 64 CPUs, 1024 cores (json)

Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GiB
Storage:
  • 240 GB SSD SATA MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 480 GB SSD SATA MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 4.0 TB HDD SATA ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
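
To check that a reserved dahu node exposes the disks listed above, the /dev/disk/by-path links can be resolved directly. A minimal sketch, assuming it is run on a node and that the listed PCI/ATA paths are present (resolved device names such as /dev/sda are only examples, not guarantees):

    # Minimal sketch: on a reserved dahu node, list whole disks by PCI path and
    # compare them with the storage entries above. Standard library only.
    import os

    BY_PATH = "/dev/disk/by-path"

    for entry in sorted(os.listdir(BY_PATH)):
        if "-part" in entry:          # skip partition links, keep whole disks
            continue
        target = os.path.realpath(os.path.join(BY_PATH, entry))
        # e.g. pci-0000:00:11.5-ata-3 -> /dev/sd? (example mapping, not guaranteed)
        print(f"{entry:45s} -> {target}")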

yeti

4 nodes, 16 CPUs, 256 cores (json)

Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:6e:00.0-nvme-1)
  • 480 GB SSD SAS SSDSC2KG480G7R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
  • 2.0 TB HDD SAS ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0)
  • 2.0 TB HDD SAS ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0)
  • 2.0 TB HDD SAS ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0)
Network:
  • eth0/eno113, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno114, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno115, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno116, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
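
The Omni-Path interfaces (driver hfi1) listed for dahu and yeti register as InfiniBand-class devices, so their configured rate can be read from sysfs on a reserved node. A minimal sketch, assuming the adapter appears under /sys/class/infiniband (the device name, e.g. hfi1_0, and the port layout are assumptions):

    # Minimal sketch: print link rate and state for every InfiniBand-class
    # device (including the Omni-Path HFI exposed by the hfi1 driver).
    from pathlib import Path

    ib_class = Path("/sys/class/infiniband")

    for dev in sorted(ib_class.iterdir()):
        for port in sorted((dev / "ports").iterdir()):
            rate = (port / "rate").read_text().strip()
            state = (port / "state").read_text().strip()
            print(f"{dev.name} port {port.name}: {rate} ({state})")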

Last generated from the Grid'5000 Reference API on 2018-12-10 (commit 043445bbb)