Grenoble:Hardware

From Grid5000
Revision as of 14:55, 8 May 2018 by Lnussbaum


Summary

Cluster  Queue    Date of arrival  Nodes  CPU                      Cores         Memory  Storage                         Network
edel     default  2008-10-03       68     2 x Intel Xeon E5520     4 cores/CPU   24 GB   119 GB SSD or 59 GB SSD         1 Gbps + 40 Gbps InfiniBand
                                                                                         (per-node split: see details)
genepi   default  2008-10-01       31     2 x Intel Xeon E5420     4 cores/CPU   8 GB    153 GB HDD                      1 Gbps + 20 Gbps InfiniBand
yeti     testing  2018-01-16       4      4 x Intel Xeon (Skylake) 16 cores/CPU  768 GB  446 GB SSD + 3 x 1.819 TB HDD   10 Gbps
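
Throughout this page, groups of nodes are written in the usual nodeset bracket notation (e.g. edel-[1,5-6,8] means edel-1, edel-5, edel-6 and edel-8). A minimal sketch of expanding that notation, for scripting against these lists (not part of the generated page; for production use a dedicated tool such as nodeset/clustershell is the usual choice):

```python
def expand_nodeset(expr: str) -> list[str]:
    """Expand 'edel-[1,5-6,8]' into ['edel-1', 'edel-5', 'edel-6', 'edel-8']."""
    prefix, _, body = expr.partition("[")
    body = body.rstrip("]")
    nodes = []
    for part in body.split(","):
        if "-" in part:
            # Numeric range, e.g. "5-6" -> 5, 6
            lo, hi = part.split("-")
            nodes.extend(f"{prefix}{i}" for i in range(int(lo), int(hi) + 1))
        else:
            nodes.append(f"{prefix}{part}")
    return nodes

print(expand_nodeset("edel-[17,34]"))  # ['edel-17', 'edel-34']
```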

Cluster details

edel

68 nodes, 136 cpus, 544 cores, split as follows due to differences between nodes (json)

edel-[1,5-6,8-9,12,15-16,19,21,23-24,28-29,32,37,39-43,48-49,52,55-57,59,62,65-71] (36 nodes, 72 cpus, 288 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[2-3,7,10-11,13-14,22,26,30,36,38,44,47,51,53,58,60,64] (19 nodes, 38 cpus, 152 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[17,34] (2 nodes, 4 cpus, 16 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C300-MTFDDAA064M (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[4,18,31,33] (4 nodes, 8 cpus, 32 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[20] (1 node, 2 cpus, 8 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 59 GB SSD SATA C400-MTFDDAA064M (driver: ahci, path: /dev/disk/by-id/wwn-0x500a07510202c04b)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

edel-[25,35,46,50,63,72] (6 nodes, 12 cpus, 48 cores)
Model: Bull bullx B500 compute blades
Date of arrival: 2008-10-03
CPU: Intel Xeon E5520 Nehalem 2.27GHz (2 CPUs/node, 4 cores/CPU)
Memory: 24 GB
Storage: 119 GB SSD SATA C400-MTFDDAA128M (driver: ahci, path: node-was-not-available-to-retrieve-this-value)
Network:
  • eth0/enp1s0f0, Ethernet (driver: igb), configured rate: 1 Gbps
  • eth1/enp1s0f1, Ethernet (driver: igb), configured rate: n/c - unavailable for experiment
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 40 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment
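
As a quick sanity check (a sketch, not part of the generated page), the six edel groups above sum back to the totals stated in the cluster header:

```python
# Per-group (nodes, cpus, cores) counts, copied from the edel listing above.
groups = [
    (36, 72, 288),  # edel-[1,5-6,...]        119 GB SSD (C400-MTFDDAA128M)
    (19, 38, 152),  # edel-[2-3,7,...]         59 GB SSD (C400-MTFDDAA064M)
    (2,  4,  16),   # edel-[17,34]             59 GB SSD (C300-MTFDDAA064M)
    (4,  8,  32),   # edel-[4,18,31,33]
    (1,  2,  8),    # edel-[20]
    (6,  12, 48),   # edel-[25,35,46,50,63,72]
]
nodes, cpus, cores = map(sum, zip(*groups))
print(nodes, cpus, cores)  # 68 136 544, matching "68 nodes, 136 cpus, 544 cores"
```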

genepi

31 nodes, 62 cpus, 248 cores (json)

Model: Bull R422-E1
Date of arrival: 2008-10-01
CPU: Intel Xeon E5420 Harpertown 2.50GHz (2 CPUs/node, 4 cores/CPU)
Memory: 8 GB
Storage: 153 GB HDD SATA WDC WD1600YS-01S (driver: ata_piix, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/enp5s0f0, Ethernet (driver: e1000e), configured rate: n/c - unavailable for experiment
  • eth1/enp5s0f1, Ethernet (driver: e1000e), configured rate: 1 Gbps
  • ib0, InfiniBand (driver: mlx4_core), configured rate: 20 Gbps
  • ib1, InfiniBand (driver: mlx4_core), configured rate: n/c - unavailable for experiment

yeti (testing queue)

4 nodes, 16 cpus, 256 cores (json)

Model: Dell PowerEdge R940
Date of arrival: 2018-01-16
CPU: Intel Xeon Skylake 2.10GHz (4 CPUs/node, 16 cores/CPU)
Memory: 768 GB
Storage:
  • 446 GB SSD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:0:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:1:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:2:0)
  • 1.819 TB HDD SAS PERC H740P Adp (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:18:00.0-scsi-0:2:3:0)
Network:
  • eth0/eno113, Ethernet (driver: i40e), configured rate: 10 Gbps
  • eth1/eno114, Ethernet (driver: i40e), configured rate: n/c - unavailable for experiment
  • eth2/eno115, Ethernet (driver: i40e), configured rate: n/c - unavailable for experiment
  • eth3/eno116, Ethernet (driver: i40e), configured rate: n/c - unavailable for experiment

Generated from the Grid5000 APIs on 2018-05-08
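
The same per-node descriptions are available programmatically from the Grid5000 reference API. A hedged sketch of building and fetching such a request with the standard library (the endpoint path follows the commonly documented reference-API layout but should be treated as an assumption and checked against the current API docs; access normally requires Grid'5000 credentials and network access, so the fetch is defined but not invoked here):

```python
import json
import urllib.request

def node_list_url(site: str, cluster: str) -> str:
    # Reference-API style path (assumption; verify against the current API docs).
    return f"https://api.grid5000.fr/stable/sites/{site}/clusters/{cluster}/nodes"

def fetch_nodes(site: str, cluster: str):
    # Requires network access and, usually, authentication; not run here.
    with urllib.request.urlopen(node_list_url(site, cluster)) as resp:
        return json.load(resp)

# Example URL for the yeti cluster described above:
print(node_list_url("grenoble", "yeti"))
```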