A Blog by Jonathan Low

 

May 31, 2020

How 437 Petaflops of Computing Power Speed Covid-19 Research

The primary advantage is the ability to process vast amounts of data quickly, which generates faster insights into how to prevent the spread of the virus and treat it effectively while vaccine development, to which supercomputing is also contributing, proceeds. JL

Kyle Wiggers reports in VentureBeat:

Powerful computers allow researchers to undertake large numbers of calculations in epidemiology, bioinformatics, and molecular modeling, which would take months on traditional computing platforms. The insights generated help advance (understanding of) viral-human interaction, viral structure and function, molecule design, drug repurposing, and patient outcomes. Scientists (can) simulate how 8,000 different molecules interact with COVID-19, resulting in the isolation of 77 compounds likely to render the virus unable to infect host cells (and) process hundreds of images generated by computed tomography (to) give diagnoses in seconds.
In March, IBM announced alongside the White House Office of Science and Technology Policy that it would help coordinate an effort to provide hundreds of petaflops of compute to scientists researching the coronavirus. As part of the newly launched COVID-19 High Performance Computing (HPC) Consortium, the company pledged to assist in evaluating proposals and to provide access to resources for projects that “make the most immediate impact.”
Almost two months later, IBM claims those efforts are beginning to bear fruit.
This week, IBM announced that UK Research and Innovation (UKRI) and the Swiss National Supercomputing Centre (CSCS) will join the COVID-19 HPC Consortium, making available machines including the University of Edinburgh’s ARCHER; the Science and Technology Facilities Council’s DIRAC; the Biotechnology and Biological Sciences Research Council’s Earlham Institute; and Piz Daint, the sixth-ranked supercomputer in the world according to the TOP500 list. With the new additions, scientists can take advantage of over 437 petaflops (437 quadrillion floating-point operations per second) of compute power, up from 330 petaflops in mid-March, across hardware owned and operated by the Consortium’s 40 partners.
IBM says more than 59 projects in the U.S., Germany, India, South Africa, Saudi Arabia, Croatia, Spain, the U.K., and other countries have been matched, free of charge, with supercomputers from Google Cloud, Amazon Web Services, Microsoft Azure, IBM, and dozens of academic and nonprofit research institutions. (Normally, a petaflop of computing power costs between $2 million and $3 million, according to IBM.) Collectively, these projects are running on over 113,000 nodes containing 4.2 million processor cores and more than 43,000 graphics cards.
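To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python; the petaflop count and price range are the ones quoted above, and everything else is illustrative:

PETAFLOPS = 437                 # aggregate Consortium compute cited above
COST_PER_PETAFLOP = (2e6, 3e6)  # IBM's quoted range, in US dollars

low = PETAFLOPS * COST_PER_PETAFLOP[0]
high = PETAFLOPS * COST_PER_PETAFLOP[1]
print(f"Donated compute is worth roughly ${low / 1e9:.2f}B-${high / 1e9:.2f}B")
# Prints: Donated compute is worth roughly $0.87B-$1.31B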
Powerful computers allow researchers to undertake large numbers of calculations in epidemiology, bioinformatics, and molecular modeling, many of which would take months on traditional computing platforms (or years if done by hand). The insights generated by the experiments can help advance humanity’s understanding of COVID-19 in key ways, such as viral-human interaction, viral structure and function, small molecule design, drug repurposing, and patient trajectory and outcomes.
Researchers affiliated with the University of Utah tapped the Consortium’s compute to generate more than 2,000 molecular models of compounds relevant for COVID-19 and rank them based on force field energy estimates, which they theorize could help design better peptide inhibitors of a key enzyme of the novel coronavirus. Meanwhile, IBM Research Europe partnered with scientists at the University of Oxford to combine molecular simulations with AI to discover potential compounds that could be repurposed as anti-COVID-19 drugs. A pair of NASA researchers are working to define risk groups by performing genome analysis on COVID-19 patients who develop acute respiratory distress syndrome. Not to be outdone, scientists at Utah State University intend to leverage the Consortium’s supercomputers to study the transmission of airborne respiratory infectious diseases like COVID-19.
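As a rough illustration of the ranking step in the Utah project, the Python sketch below sorts a handful of invented compounds by hypothetical force field energy estimates; the names, values, and units are placeholders, not data from the study:

# Hypothetical force field energy estimates in kcal/mol; lower (more
# negative) energy generally indicates a more stable, tighter-binding
# complex, so candidates are ranked in ascending order.
candidates = {
    "peptide_A": -42.7,
    "peptide_B": -18.3,
    "peptide_C": -55.1,
    "peptide_D": -31.9,
}

ranked = sorted(candidates.items(), key=lambda item: item[1])

for rank, (name, energy) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {energy:.1f} kcal/mol")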
IBM COVID-19 HPC Consortium
Above: A supercomputer rendering of airflow in an empty hospital room, where the vertical velocity of the air is high enough to keep virus-laden aerosols in suspension. This helps identify the regions of the room where aerosols of a given size might remain in the air column.
Image Credit: IBM
On the private sector side of the equation, Kolkata-based Novel Techsciences hopes to use the Consortium’s resources to identify phytochemicals from India’s over 3,000 medicinal plants and antiviral plant extracts that might act as natural drugs against COVID-19. London-based AI chemistry startup PostEra’s Moonshot Project has already isolated around 21 designs that target a key protein associated with the coronavirus. Another effort spearheaded by AI startup Kuano, also based in London, aims to glean insights from diseases similar to COVID-19 to pioneer a new drug that could defeat the coronavirus, while Germany-based Innoplexus hopes to discover molecules that could combat COVID-19.
Today’s update follows news that scientists tapped IBM’s Summit at Oak Ridge National Laboratory, the world’s fastest supercomputer, to simulate how 8,000 different molecules would interact with COVID-19, resulting in the isolation of 77 compounds likely to render the virus unable to infect host cells. Elsewhere, the Tianhe-1 supercomputer at the National Supercomputer Center in Tianjin was recently used to process hundreds of images generated by computed tomography and give diagnoses in seconds. And the Gauss Center for Supercomputing, an alliance of Germany’s three national supercomputing centers, said it would help those working on COVID-19 research gain expedited access to computing resources.
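At its core, that kind of screen scores a large candidate pool and keeps only the strongest predicted binders. The Python sketch below mimics the funnel with a placeholder scoring function and an invented cutoff chosen to pass roughly 1% of 8,000 molecules, on the order of the 77 compounds reported; the actual Summit runs used full molecular docking simulations rather than random scores:

import random

random.seed(0)  # reproducible placeholder scores

def mock_docking_score(molecule_id: int) -> float:
    # Stand-in for an expensive docking simulation; ignores the ID and
    # draws a score from an invented distribution (kcal/mol-like scale).
    return random.gauss(-6.0, 1.5)

THRESHOLD = -9.5  # invented cutoff for "strong predicted binder"

hits = [m for m in range(8_000) if mock_docking_score(m) < THRESHOLD]
print(f"{len(hits)} of 8,000 molecules pass the threshold")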
More recently, Folding@home and Rosetta@home, two of the largest crowdsourced supercomputing programs in the world, kick-started initiatives to uncover the mysteries behind COVID-19’s spike protein, which the virus uses to infect cells. Since the COVID-19 focus was announced earlier this year, hundreds of thousands of volunteers have joined the effort. As of March, the Folding@home network alone has been running with over an exaflop — one quintillion floating-point computations per second — of distributed computational performance, exceeding the performance of any of the Consortium’s machines.
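For scale, a one-line unit conversion (1 exaflop = 1,000 petaflops) shows how the volunteer network stacks up against the Consortium’s combined total:

EXAFLOP_IN_PETAFLOPS = 1_000  # 1 exaflop = 1,000 petaflops
CONSORTIUM_PETAFLOPS = 437    # the Consortium's combined total, cited above

ratio = EXAFLOP_IN_PETAFLOPS / CONSORTIUM_PETAFLOPS
print(f"Folding@home alone delivers ~{ratio:.1f}x the Consortium's compute")
# Prints: Folding@home alone delivers ~2.3x the Consortium's compute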
Supercomputers were used to identify and test treatments for complex and chronic diseases long before the pandemic. Researchers tapped the Texas Advanced Computing Center’s Lonestar5 cluster to simulate over 1,400 FDA-approved drugs to see if they could be used to treat cancer, for instance. Last year, eight supercomputing centers across the EU were selected to host applications in personalized medicine and drug design. And pharmaceutical company TwoXAR recently teamed up with the Asian Liver Center at Stanford to screen 25,000 drug candidates for adult liver cancer.
