Sophia:Hardware

From Grid5000


Summary

2 clusters, 89 nodes, 888 cores, 7.1 TFLOPS

Cluster | Queue   | Date of arrival | Nodes | CPU                   | Cores       | Memory | Storage    | Network
suno    | default | 2010-01-27      | 45    | 2 x Intel Xeon E5520  | 4 cores/CPU | 32 GB  | 557 GB HDD | 1 Gbps
uvb     | default | 2011-01-04      | 44    | 2 x Intel Xeon X5670  | 6 cores/CPU | 96 GB  | 232 GB HDD | 1 Gbps + 40 Gbps InfiniBand
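The per-cluster figures can be cross-checked against the summary totals. A minimal sketch in Python, using only the node and core counts from the table above (the 7.1 TFLOPS figure comes from the Reference API and is not recomputed here):

```python
# Cluster specs as listed in the summary table.
clusters = {
    "suno": {"nodes": 45, "cpus_per_node": 2, "cores_per_cpu": 4},
    "uvb":  {"nodes": 44, "cpus_per_node": 2, "cores_per_cpu": 6},
}

def totals(clusters):
    """Sum nodes and cores across all clusters."""
    nodes = sum(c["nodes"] for c in clusters.values())
    cores = sum(c["nodes"] * c["cpus_per_node"] * c["cores_per_cpu"]
                for c in clusters.values())
    return nodes, cores

nodes, cores = totals(clusters)
print(nodes, cores)  # matches the summary: 89 nodes, 888 cores
```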

Cluster details

suno

45 nodes, 90 CPUs, 360 cores (json)

Model: Dell PowerEdge R410
Date of arrival: 2010-01-27
CPU: Intel Xeon E5520 (Nehalem, 2.27GHz, 2 CPUs/node, 4 cores/CPU)
Memory: 32 GB
Storage: 557 GB HDD SAS PERC 6/i Adapter (driver: megaraid_sas, path: /dev/disk/by-path/pci-0000:03:00.0-scsi-0:2:0:0)
Network:
  • eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2
  • eth1/eno2, Ethernet, model: Broadcom NetXtreme II BCM5716 Gigabit Ethernet, driver: bnx2 - unavailable for experiment

uvb

44 nodes, 88 CPUs, 528 cores (json)

Model: Dell PowerEdge C6100
Date of arrival: 2011-01-04
CPU: Intel Xeon X5670 (Westmere, 2.93GHz, 2 CPUs/node, 6 cores/CPU)
Memory: 96 GB
Storage: 232 GB HDD SATA WDC WD2502ABYS-1 (driver: ahci, path: /dev/disk/by-path/pci-0000:00:1f.2-ata-1)
Network:
  • eth0/eno1, Ethernet, configured rate: 1 Gbps, model: Intel 82576 Gigabit Network Connection, driver: igb
  • eth1/eno2, Ethernet, model: Intel 82576 Gigabit Network Connection, driver: igb - unavailable for experiment
  • ib0, InfiniBand, configured rate: 40 Gbps, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core
  • ib1, InfiniBand, model: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE], driver: mlx4_core - unavailable for experiment
Last generated from the Grid'5000 Reference API on 2018-06-25 (commit 8ab85f540)
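This page is generated from the Grid'5000 Reference API, which exposes the same data as JSON. A hedged sketch of how one might address a cluster description with Python's standard library — the endpoint layout (`/stable/sites/<site>/clusters/<cluster>`) is an assumption based on the public API's documented structure, and actual requests require Grid'5000 network access or credentials:

```python
import json
import urllib.request

# Assumed public root of the Reference API; verify against current API docs.
API_ROOT = "https://api.grid5000.fr/stable"

def cluster_url(site, cluster):
    """Build the Reference API endpoint for one cluster's description."""
    return f"{API_ROOT}/sites/{site}/clusters/{cluster}"

def fetch_cluster(site, cluster):
    """Fetch and decode a cluster description (needs network/credentials)."""
    with urllib.request.urlopen(cluster_url(site, cluster)) as resp:
        return json.load(resp)

# Endpoints for the two Sophia clusters described above:
print(cluster_url("sophia", "suno"))
print(cluster_url("sophia", "uvb"))
```

The `fetch_cluster` helper is illustrative only; inside Grid'5000 the API is reachable directly, while from outside it requires HTTP authentication.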