Platform column
Clusters serve up a challenge
Tuesday 31 August 1999
Rajkumar Buyya with the book he has just finished on cluster computing. Picture: Eddie Jim

IF you have not heard of cluster computing by now, perhaps you've been away - on Mars. The truth is that everyone is doing it, not just computer geeks in university computer science departments.

In its simplest form, a cluster is a bunch of standalone PCs or workstations connected via a network.

The applications that typically employ clusters are those that would otherwise require supercomputers or highly reliable systems.

Clusters are something you can build yourself from off-the-shelf components, and they are needed because user applications, whether scientific or commercial, demand ever more computing resources.

For example, the scientific applications used in predicting life-threatening situations such as earthquakes or hurricanes require enormous computational power. If you tried to predict an earthquake using a single PC, you would end up predicting it only after it had occurred - with all the inevitable consequences.

In the past, these applications would be run on supercomputers costing millions of dollars. The economics meant that few organisations had pockets deep enough to afford them. In fact, in countries such as the United Kingdom, such machines were only purchased on a national scale.

Another factor that made the purchase of such machines less appealing was their short lifespan - three to four years after the purchase, the "antique" was ready to be put into a museum. Even if you wanted to upgrade, the vendor may have gone bankrupt or produced a new generation incompatible with the one you owned.

The primary reason for this decline is that the performance of commodity microprocessors and networks has improved so fast that PCs and workstations have eaten away the market share of proprietary supercomputers.

The availability of commodity high-performance microprocessors and high-speed networks has made the use of clusters an appealing vehicle for low-cost commodity supercomputing.

Today, operating systems such as the freely available Linux can be used to power cluster systems, and Linux-based clusters have already made great inroads into the marketplace.

One such system, known as Beowulf (www.beowulf.org), can be put together in a couple of days. You can download Linux from the Web and install it on two or more networked computers quite easily. The network you use can be Ethernet, Gigabit Ethernet, SCI or Myrinet. If you configure one of these nodes as a file server, it becomes the master node, and the rest of your nodes act as slaves with the ability to access resources on the master.

You can then download and install popular software environments such as MPI (Message Passing Interface) or PVM (Parallel Virtual Machine), and you will have a baby supercomputer ready to run your applications.
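
To give a flavour of what such a program looks like, here is a minimal MPI "hello world" written in C - a generic sketch rather than anything from the book - in which every process in the cluster reports its rank, the size of the job and the node it is running on.

    /* hello_mpi.c - each process in the cluster announces itself. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, len;
        char node[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);                /* start the MPI run-time */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's number */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* processes in the whole job */
        MPI_Get_processor_name(node, &len);    /* which cluster node we are on */

        printf("Hello from process %d of %d on %s\n", rank, size, node);

        MPI_Finalize();                        /* shut the run-time down cleanly */
        return 0;
    }

With a typical MPI installation this is compiled with mpicc and launched across the nodes with mpirun (for example, mpirun -np 4 hello_mpi), although the exact commands depend on the MPI implementation you choose.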

The complete recipe for building your own supercomputer is available at the Cluster Computing Resources Page (www.dgs.monash.edu.au/rajkumar/cluster/), which accompanies the book High Performance Cluster Computing, published recently by Prentice Hall in the United States.

Do-it-yourself cluster-based systems are currently rated as the best-in-class machines on price/performance when compared with the offerings from IBM or SGI. A Beowulf-class system called Cplant, at the Sandia National Laboratories in Albuquerque in the US, is rated as the 129th most powerful computer in the world (www.top500.org).

Another advantage of clusters is that they are highly scalable and upgradable: you can always add more nodes to your system to increase its overall capability, no matter how much more powerful newer nodes become.

Clusters are also ideal for running commercial applications. Take a business environment such as a bank, where all activities are automated. What happens if the server handling customer transactions fails? The bank's activities come to a halt: customers cannot deposit or withdraw money from their accounts, or even use an ATM.

Such situations cause a great deal of inconvenience and can result in loss of business. This is where clusters can come to the rescue. A bank could continue to operate even after the failure of a server by automatically isolating failed components and migrating activities to alternative ones, offering an uninterrupted service.
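
To make the idea concrete, here is a small, purely illustrative C sketch of client-side failover: the client tries a primary transaction server and, if it cannot be reached, falls back to a backup node. The host names and port number are invented for the example, and a real cluster product would perform this kind of switch-over transparently at the system level rather than in every client program.

    /* failover.c - illustrative only: try the primary server, then a backup. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    /* Try to open a TCP connection to host:port; return the socket or -1. */
    static int try_connect(const char *host, const char *port)
    {
        struct addrinfo hints, *res, *p;
        int fd = -1;

        memset(&hints, 0, sizeof hints);
        hints.ai_family = AF_UNSPEC;       /* IPv4 or IPv6 */
        hints.ai_socktype = SOCK_STREAM;   /* TCP */

        if (getaddrinfo(host, port, &hints, &res) != 0)
            return -1;

        for (p = res; p != NULL; p = p->ai_next) {
            fd = socket(p->ai_family, p->ai_socktype, p->ai_protocol);
            if (fd < 0)
                continue;
            if (connect(fd, p->ai_addr, p->ai_addrlen) == 0)
                break;                     /* connected */
            close(fd);
            fd = -1;
        }
        freeaddrinfo(res);
        return fd;
    }

    int main(void)
    {
        /* Hypothetical node names for the primary and backup servers. */
        const char *nodes[] = { "txn-primary.example.bank", "txn-backup.example.bank" };
        const char *port = "9000";
        int i, fd = -1;

        for (i = 0; i < 2 && fd < 0; i++) {
            fd = try_connect(nodes[i], port);
            if (fd < 0)
                fprintf(stderr, "%s unavailable, trying the next node...\n", nodes[i]);
        }

        if (fd < 0) {
            fprintf(stderr, "no transaction server available\n");
            return 1;
        }
        printf("connected to %s\n", nodes[i - 1]);
        /* ... carry out the customer's transaction over fd ... */
        close(fd);
        return 0;
    }

The point of the sketch is simply that a client (or a cluster monitor acting on its behalf) keeps a list of equivalent nodes and moves on when one fails; commercial cluster software adds heartbeat monitoring, automatic restart and state migration on top of this basic idea.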

With the increased popularity of the Web, system availability is becoming critical, especially where e-commerce is concerned. Popular free email providers such as Hotmail (www.hotmail.com) and search engines such as HotBot (www.hotbot.com) are powered by clusters, which is why such Web services are highly available whenever you access them.

Clusters are used in many Australian research centres. Just to give you a glimpse of the use of clusters in Melbourne:

Monash University researchers use clusters as a testbed for developing next-generation high-performance computing software and running scientific simulation applications (hathor.cs.monash.edu.au).

RMIT researchers use clusters to develop a robust Internet web server (www.serc.rmit.edu.au).

Scientists at the Swinburne Centre for Astrophysics and Supercomputing (and also at Monash) use clusters for conducting research into computationally demanding problems in pulsar astrophysics (www.swin.edu.au/astronomy).

Clusters have been used for solving grand challenge applications such as weather modelling, automobile crash simulations, life sciences, computational fluid dynamics, nuclear simulations, image processing, electromagnetics, data mining, aerodynamics and astrophysics.

The IEEE Computer Society has recently formed a new task force on cluster computing, led by Rajkumar Buyya (Monash University) and Mark Baker (Portsmouth University, UK), which will host the first international event dedicated to cluster computing in Melbourne on 23 December (www.dgs.monash.edu.au/rajkumar/tfcc/IWCC99/).

Rajkumar Buyya recently completed a second volume, High Performance Cluster Computing: Programming and Applications (Prentice Hall, USA). He is a research scholar at the School of Computer Science and Software Engineering, Monash University, Melbourne, and Mark Baker is a faculty member at Portsmouth University, UK.
