Thursday, October 22, 2015

US and China compete in the supercomputer race

A quintillion is a one followed by 18 zeros; measured in kilometers, it roughly spans the Milky Way from end to end. A quintillion is also more than twice the number of seconds in the entire history of the universe. Obama has just decreed that, within a few years, a quintillion will also be the number of calculations a supercomputer performs in a single second. At the end of his term, the US president is squaring up to China, which currently leads the supercomputer rankings with its Tianhe-2, by leaving as a legacy the creation of the National Strategic Computing Initiative, out of which the biggest computer ever built will be born.
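
Those comparisons hold up to a rough back-of-the-envelope check. The sketch below assumes a Milky Way diameter of about 100,000 light years and a universe roughly 13.8 billion years old, reference figures not given in the article:

# Rough sanity check of the "one followed by 18 zeros" comparisons.
# Assumed reference values (not from the article): Milky Way diameter
# ~100,000 light years, age of the universe ~13.8 billion years.
QUINTILLION = 10**18                       # a one followed by 18 zeros

km_per_light_year = 9.46e12                # kilometers in one light year
milky_way_km = 100_000 * km_per_light_year
print(f"Milky Way span: {milky_way_km:.1e} km")            # ~9.5e17 km, close to 1e18

seconds_since_big_bang = 13.8e9 * 365.25 * 24 * 3600
print(f"Age of universe: {seconds_since_big_bang:.1e} s")  # ~4.4e17 s
print(QUINTILLION / seconds_since_big_bang)                # ~2.3, "more than twice"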

The new computer will inaugurate the exaflop era, the unit that measures a one followed by 18 zeros operations per second. Researchers studying extreme weather phenomena, such as the heat wave currently punishing Spain, global climate change and biomedicine may well be rubbing their hands, along with vehicle designers and big-data scientists. The military and energy-management sectors will also take part, through their respective agencies, in the development of the American project.

The impact on the study of new drugs is one of the most obvious. "The easy drugs, the kind you could cure with field herbs, have already been found. Now almost all the compounds reaching the market have come out of a computer," illustrates Modesto Orozco, a scientist at the Institute for Research in Biomedicine of Barcelona. "Pharmaceutical companies store millions of molecules in their libraries that are not easy to analyze experimentally without large computers. You have to try them one by one, but also in combination with others. We cannot use a billion mice," says Orozco. By some estimates, the analysis will go from testing 100,000 molecules a year today to a billion. "Medicine is going to look more and more like engineering. The new, bigger machines have a transverse impact," especially in complex diseases such as cancer, [to analyze] the synergistic effect of drugs. It is not just a matter of analyzing a drug or a combination, but also of how each patient reacts in their own particular way depending on their genomic profile. "We want personalized therapies. We want to know why there are drugs that are very good for 90% of the population but very harmful for 5%. Those drugs do not reach the market today because they do not get past the approval of the very conservative drug agencies, but that would not be the case if we could simulate in detail how each person's profile is specifically affected."

For Orozco, Obama's announcement "is like when Kennedy said we had to go to the Moon. What matters is not so much the machine itself as everything beyond it: the effort required to build it." The biomedical researcher is also director of the Life Sciences department of the Barcelona Supercomputing Center (BSC-CNS), which houses the only supercomputer in Spain: MareNostrum. It is some 15,000 times less powerful than the projected machine according to the standard benchmark, the Linpack program, which is run on all these big computers to measure their speed on equal terms. Even so, MareNostrum is an essential resource for some 3,000 projects from research centers, universities and companies such as Repsol and Iberdrola.
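
The idea behind a Linpack-style rating is simple: time the solution of a large dense linear system and convert the elapsed time into floating-point operations per second. A minimal NumPy sketch of that idea (an illustration only, not the actual HPL/Linpack code):

# Minimal sketch of how a Linpack-style benchmark rates a machine:
# solve a dense N x N linear system and turn the elapsed time into FLOP/s.
import time
import numpy as np

N = 2000
A = np.random.rand(N, N)
b = np.random.rand(N)

start = time.perf_counter()
x = np.linalg.solve(A, b)            # LU factorization + triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * N**3           # standard operation count for an LU solve
print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s on this machine")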

Its director, Mateo Valero, also puts the announcement in context. "The most important thing is not the exaflop, but that it is the first time the United States wants three of its agencies (Defense, Energy and the National Science Foundation) to work together in an integrated way, alongside large companies and universities. 'A country that does not compute cannot compete,' they say over there."

None of the experts consulted dares to give a date earlier than 2022 or 2025 for such a device to start operating. Its cost will be announced within a few months. For reference, a project already under way, CORAL, intends to build three supercomputers with a capacity of 150 petaflops each, for about $525 million in total. The new project would be roughly six times more powerful.

A new frontier
For 30 years, supercomputers have followed a curious rule: every ten years their speed has increased a thousandfold. What is more, each new frontier (from megaflops to gigaflops, from gigaflops to teraflops and from there to petaflops) has been crossed in a year ending in 8. This is the first time that unwritten rule will not hold: it is impossible for machines to reach exaflop capacity by 2018.
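
The ladder of units behind that rule is just successive factors of a thousand. A small illustrative sketch of the progression, taking as reference the petaflop barrier broken in 2008 by Roadrunner (a milestone the article alludes to with its "years ending in 8"):

# The units the article walks through are successive factors of 1,000.
units = ["megaflops", "gigaflops", "teraflops", "petaflops", "exaflops"]
for i, name in enumerate(units):
    print(f"1 {name:10s} = 10^{6 + 3 * i} operations per second")

petaflop_year = 2008                      # Roadrunner broke the petaflop barrier
print("Rule of thumb would have predicted exaflops by", petaflop_year + 10)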

Processors are reaching the physical limits of miniaturization. If that computing power is to be achieved, the only remedy is to pack more and more processors into the supercomputer. The Tianhe-2 includes 6 million of these processors (a personal computer makes do with one) versus the 100 million the new machine would need. "The technical challenge is enormous," says Valero: "a single task has to be divided among 100 million of them to run." Today's hardware and software are not up to it, and neither is the human element. "No company alone can build them today." Nor is the training of programmers. Hence the United States has gathered all its forces: universities, government and industry.
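
The software challenge Valero describes is easiest to see in miniature: one computation has to be chopped into pieces, handed out to every processor and then recombined. A toy analogy using a handful of local worker processes in Python (an exascale machine must do the equivalent across some 100 million cores):

# Toy analogy: split one computation among workers and combine the results.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n_workers = 4
    total_n = 10_000_000
    step = total_n // n_workers
    chunks = [(i * step, (i + 1) * step) for i in range(n_workers)]

    with Pool(n_workers) as pool:
        total = sum(pool.map(partial_sum, chunks))   # combine partial results
    print(total == sum(range(total_n)))              # True: same answer, split work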

It is not the only difficulty facing the project. The new supercomputer will be able to predict the future of the Earth's climate with unprecedented detail and reliability, and far forward in time: to the end of this century. That goal may seem paradoxical: a great deal of energy is needed to run a model of consumption that is unsustainable for the environment. Its little brother, MareNostrum, draws 1 megawatt of power, or 1.5 megawatts counting the energy needed to cool its processors. On top of the environmental cost there is an economic one: about 1.5 million euros a year in electricity bills. With current technology, the new supercomputer "would need more than 500 megawatts to operate, not counting what is needed to cool it," ventures the head of MareNostrum. For comparison, 500 megawatts is about half the power output of a Spanish nuclear plant.
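
The figures in that paragraph fit together with simple arithmetic: a steady 1.5-megawatt draw over a year, priced at the electricity rate implied by the article's own 1.5-million-euro bill (roughly 0.11 euros per kWh, an inferred figure, not one stated in the article):

# Energy and cost arithmetic behind the article's figures.
hours_per_year = 365 * 24                        # 8,760 hours

mare_nostrum_mw = 1.5                            # power draw including cooling
energy_mwh = mare_nostrum_mw * hours_per_year    # ~13,140 MWh per year
implied_price = 1_500_000 / (energy_mwh * 1000)  # euros per kWh, ~0.11 (inferred)
print(f"Implied electricity price: {implied_price:.2f} EUR/kWh")

exascale_mw = 500                                # estimate quoted for current technology
exascale_bill = exascale_mw * hours_per_year * 1000 * implied_price
print(f"Hypothetical annual bill at 500 MW: {exascale_bill / 1e6:.0f} million EUR")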

Ahead of the exaflop era lies green computing: processors that are more efficient in their energy consumption. Researchers are working to cut that consumption to 50, or even 20, megawatts. One current project, also in the US, the Summit supercomputer, will use only 10% more energy than the giant Titan, currently the second fastest computer in the world, while multiplying its capacity by five to ten.
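
Taken at face value, those figures imply a large jump in performance per watt. A quick calculation under the article's own assumptions (10% more energy, five to ten times the capacity, with Titan normalized to 1):

# Performance-per-watt implied by the Summit-vs-Titan comparison in the article.
titan_power, titan_speed = 1.0, 1.0
summit_power = titan_power * 1.10             # "only 10% more energy"
for factor in (5, 10):                        # "five to ten times their capacity"
    summit_speed = titan_speed * factor
    gain = (summit_speed / summit_power) / (titan_speed / titan_power)
    print(f"x{factor} speed -> ~{gain:.1f}x more performance per watt")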

Good news for the environment, which adds to the hopes climate-change researchers have placed in the announced machine. Friederike Otto, of the University of Oxford, coordinates climateprediction.net, a huge supercomputing project for the study of climate change. "We want to simulate extreme weather events, but they need much more computing power than any current supercomputer could tackle alone," she says. Lacking that, the project requests machine time on multiple computers around the world, but it still does not reach the desired computing power. Her colleague Francisco Doblas, ICREA professor and director of the Earth Sciences department of the BSC, is equally hopeful about the project.

"The big difference when the new computer is used will be the spatial resolution with which we do our simulations at the end of the century, but not only, also for us to predict phenomena such as El Niñe year so far" Doblas points. It is getting the robot portrait of Earth's climate many years away. And, continuing the analogy with a photograph, current models of that future image of the Earth are about 50 kilometers pixel side. "In 2025, with new computers, the resolution could reach only 1 kilometer," said the researcher. To anticipate how it will evolve El Niño until December these researchers need a computer as MareNostrum, with 40,000 processors, work 24 hours a day and a week.

The film of the evolution of the global climate up to the end of the century demands even longer calculation times: using 2,000 MareNostrum processors full time, those calculations need six months, Doblas notes as an example. Simulating how a heat wave like the current one evolves, an extreme weather event, would require running 10,000 simulations at 1-kilometer resolution, something "unthinkable" today.
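
The two workloads Doblas describes can be compared in core-hours, a common accounting unit on shared machines (the unit itself is standard practice, not something the article mentions):

# Core-hours implied by the two examples Doblas gives.
el_nino = 40_000 * 24 * 7            # 40,000 processors, 24 h/day, one week
end_of_century = 2_000 * 24 * 182    # 2,000 processors full-time for ~6 months
print(f"El Nino forecast:     ~{el_nino / 1e6:.1f} million core-hours")
print(f"End-of-century movie: ~{end_of_century / 1e6:.1f} million core-hours")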

As with the combination of millions of molecules and patients' personal genetic profiles, anticipating global climate change means taking many variables into account. "We are beginning to understand the dynamics of the oceans, the polar ice and other systems, but now we need to combine their data to know how they influence one another, and how climate change behaves at small scales, over specific areas of the Earth," explains Doblas.

The aim is to confirm theories: "We want to understand the biophysical processes in the climate, land use, the interaction of ocean systems with the aerosols deposited on the surface of the sea, the evolution of the sea ice around Antarctica," he lists merely by way of example. For Doblas, the new computer will generate a movie that reflects the future accurately, with "more pixels, more characters and more colors" than he and his colleagues can create today.

Machine time
Unlike in the United States, in Europe the cost of using supercomputers is usually borne by the body that manages them, provided the purpose is public research. Private companies are charged for the electricity consumed, the labor cost of the computer operators and, in any case, the depreciation of the equipment. Projects are selected by a technical committee and a scientific one, which grant hours of computer use according to the interest of the project.
