Why supercomputing rocks

Fujitsu is driving awareness of the potential of supercomputing with its K and FX10 giants - the Australian National University is one of the beneficiaries

It’s the equivalent of having all seven billion people on Earth each armed with a calculator, performing one calculation per second for 17 days straight; Fujitsu’s K – until recently the world’s fastest supercomputer – could do the same work in one second.

K, named after the Japanese word for “10 quadrillion”, boasts 10.51 petaflops from a system that includes 864 racks and 88,128 CPUs. With backing from Japan’s Riken Advanced Institute for Computational Science in Kobe, it is being deployed towards ambitious and novel goals – from predicting earthquakes and tsunamis in Japan, to modelling the human genome, to exploring our cosmic origins with the help of a telescope in the Chilean desert.
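The calculator analogy can be sanity-checked with a bit of arithmetic. This sketch assumes each person performs one calculation per second, a rate not stated explicitly in the article but implied by the figures quoted:

```python
# Back-of-envelope check of the "seven billion calculators" analogy.
# Assumption: each person performs one calculation per second.
K_FLOPS = 10.51e15           # K's performance: 10.51 petaflops
population = 7e9             # people on Earth
seconds = 17 * 24 * 60 * 60  # 17 days, in seconds

human_ops = population * seconds   # total human calculations over 17 days
k_ops_in_one_second = K_FLOPS * 1  # K's operations in a single second

print(f"Humans over 17 days: {human_ops:.2e}")
print(f"K in one second:     {k_ops_in_one_second:.2e}")
```

Under that assumption the two figures agree to within a few per cent, which is as close as a headline analogy needs to be.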

A much slimmer version is making inroads in Australia, supporting the country’s climate and scientific research efforts, with Fujitsu at the helm of the initiative. The company recently won a bid to provide the Australian National University with a supercomputer that will aid advanced research in areas such as climate modelling, astronomy, physics, and geosciences.

The project, which also includes a datacentre, is worth $50 million over the next four years, with Fujitsu’s contract worth $25 million, said ANU’s Pro Vice-Chancellor (E-Strategies), Robin Stanton. The project got its legs after the 2009 Federal Budget allocated funding under the government’s Super Science Initiative, aimed at promoting climate and space research and future industries.

Fujitsu will deploy its PRIMEHPC FX10 supercomputer, derived from technology in its K computer, which will be operational in January. The deployed FX10 capacity equates to running 57,000 computers at the same time, with petascale performance of 1.2 petaflops – 1,200,000,000,000,000 floating point operations per second – and 12 petabytes of storage capacity.
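The “57,000 computers” comparison can be checked against the petaflops figure. The implied per-machine throughput below is an inference from the article’s two numbers, not a figure it states:

```python
# Sanity check of the FX10 figures quoted above.
FX10_FLOPS = 1.2e15   # 1.2 petaflops
n_desktops = 57_000   # "57,000 computers at the same time"

# Implied throughput of each of the 57,000 machines (an inference,
# not a figure from the article).
per_desktop = FX10_FLOPS / n_desktops
print(f"{per_desktop / 1e9:.0f} gigaflops per desktop")
```

The result works out to roughly 21 gigaflops per machine, a plausible figure for a desktop of the era, so the two numbers are consistent.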

It will be used by researchers for a variety of purposes, from climate studies and prediction to quantum dynamics, modelling the Great Barrier Reef, and monitoring nutrient and pollutant run-off from the continent, Stanton said.

Australia’s other supercomputing efforts include iVEC, a joint venture between CSIRO, Curtin University, Edith Cowan University, and the Western Australian Government, among others; that program will support nanotechnology, radioastronomy, mining and petroleum, multimedia, and urban planning. IBM, meanwhile, is supporting the Victorian Life Sciences Computation Initiative, backed by the Victorian Government and the University of Melbourne, for advanced life sciences research.

Fujitsu sees more untapped sectors in Australia that could ultimately benefit from supercomputing. One such area is energy.

“Things like just making sure our energy is distributed appropriately – that major grids of energy supply and peak loads do not have shortages in some areas and overloads in others – that generates massive amounts of data and needs massive amounts of processing. That is one potential application,” said Fujitsu A/NZ chief technology officer, Craig Baty.

Another application is geographical surveys, which generate massive amounts of satellite data that could be used to locate mineral deposits.

“Where should you dig and tap for oil, or modelling ice flows in the ocean,” he said.

With the Australian population exceeding 20 million, health care is another area packed with data; processing it faster could expedite drug research.

“Wherever you have people you are going to have lots of data,” Baty said.

Another potential commercial application is tracking movements and understanding how people use things, tapping into the trend of big data. With the advent of the National Broadband Network, data collection will only increase, demanding more processing power.

“The volume of all those things and phenomena requires huge amounts of petascale storage and high performance computing,” Baty said.

Supercomputers, in general, have long caught the fancy of the technology world and the race to have the fastest, in many ways, has resembled the heated space race.

Fast supercomputer

Just last month, for the first time in over two years, the US reclaimed the title of fastest supercomputer, toppling Fujitsu’s K thanks to the processing power of Sequoia, a supercomputer developed by IBM. It clocked 16.32 petaflops – quadrillions of floating-point calculations per second – using more than 1.5 million processor cores. According to other media reports, China and Russia are also in the race with their own supercomputers.

With K, Fujitsu has over the last two years made a renewed effort to make its supercomputing products commercially available outside Japan, noted Baty.

Fujitsu calls its supercomputing program “human-centric intelligent computing”. This largely spans mapping meteorological data, health monitoring, and predicting tsunamis and earthquakes, among other areas. Its other flagship projects include a supercomputer for ALMA, the world’s largest radio telescope, positioned about five kilometres above sea level in the Chilean desert to explore our cosmic origins.

Humanitarian effort

One humanitarian effort is tsunami modelling – predicting the likelihood of a tsunami so that evacuation efforts can be mounted appropriately.

“Existing forms of computing can only predict to within a few kilometres where a tsunami will hit, and it is not very fast – you get very short notice to evacuate. The objective is to narrow that down to pinpoint accuracy so evacuation and safety efforts can be carried out more effectively,” said Baty.

Tsunami modelling requires complex processing of ocean currents.

“Things like a massive bank of plankton in the path of the wave can change the direction subtly in one area, and by the time it gets to land it could be many kilometres out of whack; things like rocks or sediment can change the way the tsunami moves, and modelling that takes a lot of power,” he said.

Elsewhere in the world, Fujitsu’s supercomputing is helping Taiwan’s Central Weather Bureau with numerical weather prediction. In addition to daily weather forecasting, it will improve the country’s ability to monitor and forecast typhoons, tropical storms and other meteorological hazards – vital for Taiwan, given its location on the border of tropical and subtropical climate zones.

Another key area is modelling the human genome, to aid DNA deconstruction research and to work out how medicines impact the human body and related genetics, helping bring new drugs to market faster.

Fujitsu’s supercomputing program is part of its overall strategy to focus on the Cloud and high performance computing, tapping several high-growth areas. Ultimately, data in the Cloud and supercomputing could be combined for more applications, according to Baty.

Fujitsu Australia’s head of strategy, Philip McCormack, said: “Supercomputing is helping make the seemingly impossible possible. Historically, these computers have been used by academia. However, moving forward you will see that individuals will be able to access these machines, taking a slice of the capacity and accessing the compute power.

“Bringing the might of the supercomputer to the masses will be a major step forward.”
