Grenoble:Hardware
Revision as of 09:19, 3 February 2020

Summary

3 clusters, 40 nodes, 1408 cores, 19.0 TFLOPS

| Cluster | Queue | Date of arrival | Nodes | CPU | Cores | Memory | Storage | Network |
|---------|-------|-----------------|-------|-----|-------|--------|---------|---------|
| dahu | default | 2018-03-22 | 32 | 2 x Intel Xeon Gold 6130 | 16 cores/CPU | 192 GiB | 240 GB SSD + 480 GB SSD + 4.0 TB HDD | 10 Gbps + 100 Gbps Omni-Path |
| troll | default | 2019-12-23 | 4 | 2 x Intel Xeon Gold 5218 | 16 cores/CPU | 384 GiB + 1.5 TiB PMEM | 480 GB SSD + 1.6 TB SSD | 10 Gbps + 100 Gbps Omni-Path |
| yeti | default | 2018-01-16 | 4 | 4 x Intel Xeon Gold 6130 | 16 cores/CPU | 768 GiB | 480 GB SSD + 1.6 TB SSD + 3 x 2.0 TB HDD | 10 Gbps + 100 Gbps Omni-Path |
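The node and core totals in the summary line can be cross-checked directly from the table; a quick sketch (counts taken from the rows above, the TFLOPS figure is site-reported and not recomputed here):

```python
# Cross-check the summary totals from the per-cluster table:
# nodes x CPUs/node x cores/CPU, summed over the three clusters.
clusters = {
    "dahu":  {"nodes": 32, "cpus_per_node": 2, "cores_per_cpu": 16},
    "troll": {"nodes": 4,  "cpus_per_node": 2, "cores_per_cpu": 16},
    "yeti":  {"nodes": 4,  "cpus_per_node": 4, "cores_per_cpu": 16},
}

total_nodes = sum(c["nodes"] for c in clusters.values())
total_cores = sum(
    c["nodes"] * c["cpus_per_node"] * c["cores_per_cpu"]
    for c in clusters.values()
)
print(total_nodes, total_cores)  # 40 1408
```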

Cluster details

dahu

32 nodes, 64 cpus, 1024 cores (json)

Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GiB
Storage:
  • 240 GB SSD SATA Samsung MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 480 GB SSD SATA Samsung MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 4.0 TB HDD SATA Seagate ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
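Each cluster's "(json)" link points at a machine-readable node listing in the Grid'5000 Reference API (e.g. https://public-api.grid5000.fr/stable/sites/grenoble/clusters/dahu/nodes.json?pretty=1). A minimal parsing sketch is below; the sample record is hand-written, and the exact field names (`items`, `uid`, `architecture`, `nb_cores`) are an assumption about the API's shape that should be checked against a real response:

```python
import json

# Hand-written sample mimicking an assumed Reference API node listing;
# field names are an assumption, not taken from a live response.
sample = json.loads("""
{
  "items": [
    {"uid": "dahu-1", "architecture": {"nb_procs": 2, "nb_cores": 32}},
    {"uid": "dahu-2", "architecture": {"nb_procs": 2, "nb_cores": 32}}
  ]
}
""")

def total_cores(listing):
    """Sum the per-node core counts over every record in a listing."""
    return sum(node["architecture"]["nb_cores"] for node in listing["items"])

print(total_cores(sample))  # 64
```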

troll

4 nodes, 8 cpus, 128 cores (json)

Model: Dell PowerEdge R640
Date of arrival: 2019-12-23
CPU: Intel Xeon Gold 5218 (Cascade Lake-SP, 2.30GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 384 GiB + 1.5 TiB PMEM
Storage:
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:d8:00.0-nvme-1)
  • 480 GB SSD SATA Micron MTFDDAK480TDN (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Mellanox Technologies MT27710 Family [ConnectX-4 Lx], driver: mlx5_core
  • eth1/eno2, Ethernet, model: Mellanox Technologies MT27710 Family [ConnectX-4 Lx], driver: mlx5_core - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
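Note that memory sizes on this page use binary units (GiB, TiB) while disk sizes use decimal units (GB, TB). Converting troll's 1.5 TiB of PMEM to decimal terabytes shows the roughly 10% gap between the two conventions:

```python
# Binary vs decimal capacity units, applied to troll's 1.5 TiB PMEM.
TIB = 1024**4  # bytes in a tebibyte (binary)
TB = 1000**4   # bytes in a terabyte (decimal)

pmem_bytes = 1.5 * TIB
pmem_tb = pmem_bytes / TB
print(round(pmem_tb, 2))  # 1.65
```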

yeti

4 nodes, 16 cpus, 256 cores, split as follows due to differences between nodes (json)

yeti-[1-2,4] (3 nodes, 12 cpus, 192 cores)
Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
  • 480 GB SSD SAS Intel SSDSC2KG480G7R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0) (reservable)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno2, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

yeti-3 (1 node, 4 cpus, 64 cores)
Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 1.6 TB SSD NVME Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
  • 480 GB SSD SAS Intel SSDSC2KG480G8R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0) (reservable)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno2, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1
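Every yeti node lists three 2.0 TB SAS HDDs marked "(reservable)", meaning they can be booked separately from the node itself. The cluster-wide reservable scratch capacity works out as:

```python
# Reservable HDD capacity across the yeti cluster, from the listings above.
nodes = 4           # yeti-[1-2,4] plus yeti-3
disks_per_node = 3  # the three "(reservable)" 2.0 TB SAS HDDs per node
disk_tb = 2.0       # decimal TB, as listed

total_reservable_tb = nodes * disks_per_node * disk_tb
print(total_reservable_tb)  # 24.0
```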

Last generated from the Grid'5000 Reference API on 2020-02-03 (commit 97d03f80b)