Grenoble:Hardware

From Grid5000

Revision as of 16:54, 6 January 2020

Summary

3 clusters, 40 nodes, 1408 cores, 17.1 TFLOPS

Cluster | Queue | Date of arrival | Nodes | CPU | Cores | Memory | Storage | Network
dahu | default | 2018-03-22 | 32 | 2 x Intel Xeon Gold 6130 | 16 cores/CPU | 192 GiB | 240 GB SSD + 480 GB SSD + 4.0 TB HDD | 10 Gbps + 100 Gbps Omni-Path
troll | testing | 2019-12-23 | 4 | 2 x Intel Xeon Gold 5218 | 16 cores/CPU | 1.88 TiB | 480 GB SSD + 1.6 TB SSD | 10 Gbps
yeti | default | 2018-01-16 | 4 | 4 x Intel Xeon Gold 6130 | 16 cores/CPU | 768 GiB | 480 GB SSD + 1.6 TB SSD + 3 x 2.0 TB HDD | 10 Gbps + 100 Gbps Omni-Path
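The summary figures can be cross-checked against the per-cluster rows above; a quick arithmetic sketch (Python, with the node, CPU, and core counts taken directly from the table):

```python
# Per-cluster figures from the summary table above:
# (nodes, CPUs per node, cores per CPU)
clusters = {
    "dahu":  (32, 2, 16),
    "troll": (4, 2, 16),
    "yeti":  (4, 4, 16),
}

total_nodes = sum(n for n, _, _ in clusters.values())
total_cores = sum(n * cpus * cores for n, cpus, cores in clusters.values())

print(total_nodes, total_cores)  # 40 1408, matching the summary line
```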

Cluster details

dahu

32 nodes, 64 CPUs, 1024 cores (json)

Model: Dell PowerEdge C6420
Date of arrival: 2018-03-22
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 192 GiB
Storage:
  • 240 GB SSD SATA Samsung MZ7KM240HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-3)
  • 480 GB SSD SATA Samsung MZ7KM480HMHQ0D3 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-4)
  • 4.0 TB HDD SATA Seagate ST4000NM0265-2DC (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:00:11.5-ata-5)
Network:
  • eth0/enp24s0f0, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/enp24s0f1, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

troll (testing queue)

4 nodes, 8 CPUs, 128 cores (json)

Model: Dell PowerEdge R640
Date of arrival: 2019-12-23
CPU: Intel Xeon Gold 5218 (Cascade Lake-SP, 2.30GHz, 2 CPUs/node, 16 cores/CPU)
Memory: 1.88 TiB
Storage:
  • 1.6 TB SSD NVMe Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:d8:00.0-nvme-1)
  • 480 GB SSD SATA Micron MTFDDAK480TDN (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Mellanox Technologies MT27710 Family [ConnectX-4 Lx], driver: mlx5_core
  • eth1/eno2, Ethernet, model: Mellanox Technologies MT27710 Family [ConnectX-4 Lx], driver: mlx5_core - unavailable for experiment

yeti

4 nodes, 16 CPUs, 256 cores, split as follows due to differences between nodes (json)

yeti-[1-2,4] (3 nodes, 12 CPUs, 192 cores)
Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 1.6 TB SSD NVMe Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
  • 480 GB SSD SAS Intel SSDSC2KG480G7R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0) (reservable)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno2, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

yeti-3 (1 node, 4 CPUs, 64 cores)
Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Gold 6130 (Skylake, 2.10GHz, 4 CPUs/node, 16 cores/CPU)
Memory: 768 GiB
Storage:
  • 1.6 TB SSD NVMe Dell Express Flash NVMe PM1725 1.6TB AIC (driver: nvme, path: /dev/disk/by-path/pci-0000:6d:00.0-nvme-1)
  • 480 GB SSD SAS Intel SSDSC2KG480G8R (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:0:0)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:1:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:2:0) (reservable)
  • 2.0 TB HDD SAS Seagate ST2000NX0463 (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:0:3:0) (reservable)
Network:
  • eth0/eno1, Ethernet, configured rate: 10 Gbps, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e
  • eth1/eno2, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth2/eno3, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • eth3/eno4, Ethernet, model: Intel Ethernet Controller X710 for 10GbE SFP+, driver: i40e - unavailable for experiment
  • ib0, Omni-Path, configured rate: 100 Gbps, model: Intel Omni-Path HFI Silicon 100 Series [discrete], driver: hfi1

Last generated from the Grid'5000 Reference API on 2020-01-06 (commit 0d31e186b)
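The per-cluster json links above point at the Grid'5000 Reference API, so node details can be pulled programmatically rather than read off this page. A minimal sketch (the URL is the one printed in the troll section; the "items" wrapper and the "uid" field in the stand-in payload are assumptions about the response shape, not guaranteed by this page):

```python
import json
import urllib.request

def fetch_nodes(url):
    """Fetch a Reference API document as JSON (network access required)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def count_nodes(payload):
    # Collection responses are assumed here to wrap node objects in an
    # "items" list; adjust if the actual payload differs.
    return len(payload.get("items", []))

# URL exactly as printed in the troll section above.
TROLL_NODES = ("https://public-api.grid5000.fr/stable/sites/grenoble/"
               "clusters/troll/nodes.json?pretty=1")

# Offline illustration with a stand-in payload (hypothetical shape):
sample = {"total": 4, "items": [{"uid": f"troll-{i}"} for i in range(1, 5)]}
print(count_nodes(sample))  # 4, matching the 4 troll nodes in the table
```

Calling `fetch_nodes(TROLL_NODES)` from inside Grid'5000 (or anywhere with access to the public API) would return the live version of the data summarized on this page.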