High-End Computing Program


COMPUTING SYSTEMS OVERVIEW

This overview summarizes the systems and related resources at the NASA Advanced Supercomputing (NAS) Facility and the NASA Center for Climate Simulation (NCCS).


Information about HEC Systems and Related Resources

Systems

NAS

Pleiades

SGI ICE cluster

162 racks (11,280 nodes)

211,360 cores
5.33 petaflops, peak
3.38 petaflops LINPACK rating (November 2014)
723 terabytes of memory
Processors:
Intel Xeon Westmere X5670 (2.93 GHz)
Intel Xeon Westmere X5675 (3.06 GHz)
Intel Xeon Sandy Bridge E5-2670 (2.6 GHz)
Intel Xeon Ivy Bridge E5-2680v2 (2.8 GHz)
Intel Xeon Haswell E5-2680v3 (2.5 GHz)

GPU/Sandy Bridge and Westmere nodes:

4 racks (128 nodes; 1 GPU per node)

1,792 total cores (general-purpose Intel Xeon Sandy Bridge and Westmere cores)
217,088 total GPU cores
320 teraflops, peak

 

Endeavour

2-node SGI UV 2000 system

1,536 cores
32 teraflops, peak

6 terabytes of memory

Intel Xeon E5-4650L Sandy Bridge processors (2.6 GHz)

 

Merope

36 racks (1,152 nodes)

13,824 cores
162 teraflops, peak

28 terabytes of memory

Intel Xeon X5670 Westmere processors (2.93 GHz)

NCCS

Discover
Aggregate System:

67 racks (3,264 nodes)
79,200 cores

3.361 petaflops peak
340.992 terabytes of memory

Scalable Unit 8 = IBM iDataPlex Cluster System
7,680 cores
Intel Xeon Sandy Bridge (2.6 GHz)
28,800 Intel Xeon Phi coprocessor cores (Many Integrated Core–MIC)

Scalable Unit 9 = IBM iDataPlex Cluster System
7,680 cores
Intel Xeon Sandy Bridge (2.6 GHz)

Scalable Units 10, 11, and 12 = SGI Rackable System
62,840 cores
Intel Xeon Haswell (2.6 GHz)
103,680 NVIDIA Tesla K40 Graphics Processing Unit (GPU) streaming cores
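
The peak ratings listed above follow from a standard back-of-the-envelope calculation: core count x clock rate x double-precision floating-point operations per core per cycle. The short Python sketch below reproduces the Endeavour and Merope figures; the per-core rates used (4 FLOPs/cycle for Westmere with SSE, 8 for Sandy Bridge with AVX) are standard microarchitecture values assumed here rather than taken from this page.

  # Theoretical peak = cores x clock (GHz) x double-precision FLOPs per core per cycle.
  # FLOPs-per-cycle values are standard microarchitecture figures (assumed, not listed
  # on this page): Westmere (SSE) = 4, Sandy Bridge/Ivy Bridge (AVX) = 8.

  def peak_teraflops(cores, clock_ghz, flops_per_cycle):
      return cores * clock_ghz * flops_per_cycle / 1000.0  # gigaflops -> teraflops

  # Endeavour: 1,536 Sandy Bridge cores at 2.6 GHz -> ~32 teraflops
  print(round(peak_teraflops(1536, 2.6, 8)))    # 32

  # Merope: 13,824 Westmere cores at 2.93 GHz -> ~162 teraflops
  print(round(peak_teraflops(13824, 2.93, 4)))  # 162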

Storage

NAS

Online: 25 petabytes of RAID disk capacity (combined total for all systems)
Archive Capacity: 126 petabytes

NCCS

Online: 19.3 petabytes of RAID disk capacity
Archive Capacity: 55 petabytes

Networking

NAS

SGI NUMAlink
Voltaire InfiniBand
10-Gigabit Ethernet
1-Gigabit Ethernet

NCCS

Mellanox Technologies InfiniBand
10-Gigabit Ethernet
1-Gigabit Ethernet

Visualization and Analysis

NAS

Hyperwall-2
128-screen tiled LCD wall arranged in 8x16 configuration
Measures 23 ft. wide by 10 ft. high
128 graphics processing units (NVIDIA GeForce GTX 780 Ti)
128 teraflops, peak processing power
2,560 cores across Intel Xeon E5-2680v2 (Ivy Bridge) 10-core processors
57 teraflops, peak processing power
393 gigabytes of GDDR5 graphics memory
1.5 petabytes of storage
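
The 57-teraflop CPU figure is consistent with the same cores x clock x FLOPs-per-cycle estimate used above, assuming the standard 8 double-precision FLOPs per cycle for Ivy Bridge with AVX (an assumption, not stated on this page):

  # Hyperwall-2 CPU partition: 2,560 Ivy Bridge cores at 2.8 GHz, 8 FLOPs/cycle (assumed AVX rate)
  print(round(2560 * 2.8 * 8 / 1000.0))  # ~57 teraflops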

NCCS

Data Exploration Theater
Visualization Wall/Hyperwall

15 Samsung UD55C 55-inch displays in 5x3 configuration
Measures 20 ft. wide by 6 ft. 10 in. high
DVI connection
1920 x 1080 (1080p) screen resolution

Visualization Wall/Hyperwall Cluster
16 Dell Precision WorkStation R5400s
2 dual-core Intel Xeon Harpertown processors per node
4 GB of memory per node
NVIDIA Quadro FX 1700 graphics
1 Gigabit Ethernet network connectivity
Control Station
One Dell FX100 Thin Client

Dali Data Analysis Nodes
IBM System x3950
272 Intel Xeon cores
24 NVIDIA Tesla M2070 GPUs with 10,752 “streaming GPU” CUDA cores
4.3 terabytes of memory
10-Gigabit Ethernet network connectivity
Fibre Channel access to the IBM GPFS file systems (~1 gigabyte/sec for large single-stream file access)
NFS access to Dirac (archive) and data portal file systems

Data Portal

HP BladeSystem C7000
16 nodes, each containing:

  • 2 quad-core Intel Xeon 2.83 GHz processors
  • 8 gigabytes of memory

371 terabytes of network storage (GPFS managed)

NFS served to compute hosts

 
