Cray Research SV1ex-1

This Cray Research SV1ex-1 vector supercomputing system was recently donated by the Ford Motor Company, which used five SV1 systems for safety and structural analysis. The SV1 line was introduced on June 19, 1998, runs the UNICOS operating system, and is in the same family as the Cray J90 systems. You could use SV1 processor modules to upgrade a J916/J932 system.

This system includes 8 processor boards with 4 processors per board, for a total of 32 processors. The processors are connected to the outside world by counter-rotating GigaRings that can move data at 800 MB/s. This system is a little unusual in that all 8 processor modules have GigaRing interfaces, an indication that it had very high I/O requirements. It has 8 memory boards totaling 64 GB of RAM.
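For a rough sense of those numbers, here is a minimal Python sketch that tallies the totals; the aggregate I/O figure is just a naive multiplication and assumes every module's GigaRing can be driven at full rate simultaneously:

```python
# Back-of-the-envelope totals for this SV1ex configuration,
# using only the counts given above.

PROCESSOR_BOARDS = 8
CPUS_PER_BOARD = 4            # quad-processor modules
MEMORY_BOARDS = 8
GB_PER_MEMORY_BOARD = 8       # see the memory board photo below
GIGARING_MB_PER_S = 800       # per counter-rotating GigaRing pair

total_cpus = PROCESSOR_BOARDS * CPUS_PER_BOARD           # 32 CPUs
total_ram_gb = MEMORY_BOARDS * GB_PER_MEMORY_BOARD       # 64 GB

# All 8 processor modules have GigaRing interfaces, so the peak
# aggregate I/O rate (if every ring ran flat out) would be:
peak_io_mb_per_s = PROCESSOR_BOARDS * GIGARING_MB_PER_S  # 6400 MB/s

print(f"{total_cpus} CPUs, {total_ram_gb} GB RAM, "
      f"{peak_io_mb_per_s} MB/s peak aggregate GigaRing I/O")
```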

The front of the Cray SV1. The cabinet in the middle holds the CPUs and memory.

The smaller cabinets to the left and right hold the I/O subsystems.

 

Jonathan White prepares the SV1 processor cabinet to come off the delivery truck. 

With the doors open you can see the CPUs and memory in the middle cabinet.

The rear side of the SV1. The monster blowers that you can see are for the Scalable I/O Node Subracks.

 

This is one of eight 8 GB memory boards in the Cray. When this machine was introduced, that was an enormous amount of memory.

The left PCB in this CPU module is the GigaRing interface.

The right PCB holds the quad processors. 

The right I/O Cabinet.

The Multipurpose Node (MPN) is at the top of the right cabinet. The MPN is a SPARC-based controller that connects the GigaRing to SBus interface boards. This one has a 100BASE-T SBus board installed and an empty SBus slot. The empty slot probably contained the required SCSI interface board.

The front and rear of the Cray Multipurpose Node (MPN-1).

The chassis contains a HyperSPARC processor running the VxWorks real-time operating system.

The MPN contains two SBuses, so it can support up to 8 SBus I/O cards.

The only SBus card currently installed is for 100 Mb Ethernet.

It can support Ethernet, FDDI, ATM, SCSI disks, and Supervisory Channel SBus (SC01).
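To make that configuration concrete, here is a small Python sketch of this MPN-1's SBus layout as described above; the even 4-slots-per-bus split is inferred from the two-bus, eight-card figure, and the SCSI card in the empty slot is only the guess mentioned earlier:

```python
# Hypothetical model of this MPN-1's SBus configuration,
# based on the description above.

SBUS_COUNT = 2
SLOTS_PER_SBUS = 4          # inferred: 2 buses x 4 slots = 8 cards max

SUPPORTED_CARDS = {"Ethernet", "FDDI", "ATM", "SCSI",
                   "SC01 supervisory channel"}

# Keyed by (bus, slot); what is visibly installed in this unit:
installed = {
    (0, 0): "100 Mb Ethernet",   # the 100BASE-T board we can see
    (0, 1): None,                # empty; probably held the SCSI card
}

occupied = sum(1 for card in installed.values() if card is not None)
free_slots = SBUS_COUNT * SLOTS_PER_SBUS - occupied
print(f"{free_slots} of {SBUS_COUNT * SLOTS_PER_SBUS} SBus slots free")
```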


One of two SIO chassis. 

Both the left and the right cabinets contain one Scalable I/O (SIO) Node Subrack (NSR-1). The NSR-1 in the left cabinet contains four FibreChannel Nodes (FCN-1) and the NSR-1 in the right cabinet contains one. Each FCN-1 has five copper FibreChannel interfaces for connecting to disk arrays. The right NSR-1 also contains two HIPPI Nodes (HPN-1, or possibly HPN-2). These are used to connect to HIPPI network disk arrays at 100 MB/s, or to switched HIPPI networks that can be used to connect to frame buffers. The right NSR-1 also contains one ESN-1 Enterprise Systems Connection (ESCON) interface. The ESN-1 provides an optical link to channel and control units that implement the Enterprise Systems Architecture/390 specification. It has four independent channels, each with a bandwidth of 17 MB/s, and can be used to connect to IBM mainframe peripherals.
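As a back-of-the-envelope summary of all that I/O, here is a short Python sketch; the HIPPI and ESCON rates come from the description above, while the 100 MB/s per copper FibreChannel link is an assumption based on the 1 Gb/s FibreChannel common in that era:

```python
# Rough peak-bandwidth tally for the SIO nodes described above.

io_nodes = [
    # (description, number of links, MB/s per link)
    ("FCN-1 FibreChannel (4 nodes left + 1 right, 5 links each)",
     5 * 5, 100),   # 100 MB/s per link is an assumption (1 Gb/s FC)
    ("HPN HIPPI nodes (right subrack)", 2, 100),
    ("ESN-1 ESCON channels (right subrack)", 4, 17),
]

for desc, links, mb_per_s in io_nodes:
    print(f"{desc}: {links} x {mb_per_s} MB/s = {links * mb_per_s} MB/s")
```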