Who said you can't use commodity server hardware to build systems capable of tackling large-scale computational tasks? Appro International leverages off-the-shelf Intel hardware to build server-blade cluster systems designed to handle HPC (high-performance computing) applications.
The Hyperblade systems can be deployed to run home-grown applications performing such compute-intensive tasks as stock trading, seismic modeling and genome research. Accordingly, Appro is targeting financial institutions, government agencies, oil/gas industry interests and education centers with these systems.
I got a chance to see Hyperblades in action when I dropped by Appro's headquarters in Milpitas, Calif. The systems, which are available now at a starting price of $145,000 for 80 blades, use standard Intel E7500 and E7501 chipsets to support dual Xeon processors and DDR memory. Each blade uses an industry-standard ATX motherboard and has a single ATA hard drive, dual embedded 10/100 or Gigabit NICs, and a single PCI-X slot.
Eighty dual-processor Hyperblades can fit in Appro's customized cabinet. That cabinet is wider than a standard data center rack, however, so the Hyperblade doesn't necessarily offer better processor density than server blades from vendors such as Dell, HP and IBM.
IBM's BladeCenter, for example, holds 14 dual-Xeon blades in a 7U chassis, meaning 84 blades can reside in a standard 42U rack.
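For readers weighing density against price, here is a quick back-of-the-envelope sketch in Python using only the figures quoted above; the per-blade figure assumes the $145,000 starting price covers the full 80-blade configuration, as the quote implies.

```python
# Rough density and cost comparison, based on the figures cited in this column:
# 80 blades per Appro cabinet at a $145,000 starting price, and
# 14 blades per 7U IBM BladeCenter chassis in a standard 42U rack.

appro_blades_per_cabinet = 80
appro_starting_price = 145_000

bladecenter_blades_per_chassis = 14
bladecenter_chassis_height_u = 7
standard_rack_height_u = 42

# Blades that fit in a standard 42U rack filled with BladeCenter chassis
chassis_per_rack = standard_rack_height_u // bladecenter_chassis_height_u   # 6
ibm_blades_per_rack = chassis_per_rack * bladecenter_blades_per_chassis     # 84

# Entry price per blade for the Appro configuration
appro_price_per_blade = appro_starting_price / appro_blades_per_cabinet     # 1812.50

print(f"IBM BladeCenter: {ibm_blades_per_rack} blades per standard 42U rack")
print(f"Appro Hyperblade: {appro_blades_per_cabinet} blades per (wider) cabinet, "
      f"~${appro_price_per_blade:,.2f} per blade at the quoted starting price")
```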
Appro recently announced InfiniBand support, offering interconnects with 10Gbps of bidirectional throughput for HPC applications. Each blade's PCI-X slot can hold a high-speed interconnect card, such as an InfiniBand HCA (host channel adapter).
The Hyperblade system I looked at had blades equipped with InfiniCon Systems' InfiniBand HCAs. The system also used an InfiniCon switch to connect the blades to the InfiniBand fabric.
Appro packages Hyperblades with Red Hat Linux and will support AMD's new Opteron processors when they are released.
Is your company evaluating blade server technologies? Let me know at francis_chu@ziffdavis.com.