MIT joins White House supercomputing effort to speed up search for Covid-19 solutions

The White House has announced the launch of the Covid-19 High Performance Computing Consortium, a collaboration among industry, government, and academic institutions that aims to make their supercomputing resources available to the wider research community, in an effort to speed up the search for solutions to the evolving Covid-19 pandemic.

MIT has joined the consortium, which is led by the U.S. Department of Energy, the National Science Foundation, and NASA.

MIT News spoke with Christopher Hill, principal research scientist in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, who is serving on the new consortium’s steering committee, about how MIT’s computing power will aid in the fight against Covid-19.

Q: How did MIT become a part of this consortium? 

A: IBM, which has longstanding computing relationships with both the government and MIT, approached the Institute late last week about joining. The Department of Energy owns the IBM-built Summit supercomputer, located at Oak Ridge National Laboratory, which was already being used to identify pharmaceutical compounds that might be effective against this coronavirus. In addition to its close working relationship with MIT, IBM had also donated the Satori supercomputer as part of the launch of the MIT Schwarzman College of Computing. We obviously want to do everything we can to help combat this pandemic, so we jumped at the chance to be part of a larger effort.

Q: What is MIT bringing to the consortium?

A: We’re primarily bringing two systems to the effort: Satori and SuperCloud, an unclassified system run by Lincoln Laboratory. Both systems have very large numbers of graphics processing units (GPUs), which let the machines process information far more quickly, and both have extra-large memory. That makes them slightly different from other machines in the consortium, in ways that may be helpful for some types of problems.

For example, MIT’s two systems seem to be especially well-suited to analyzing images from cryo-electron microscopy, which uses an electron microscope to examine materials at ultralow temperatures. The ultralow temperatures slow the motion of atoms, making the images clearer. In addition to providing hardware, MIT faculty and staff have already expressed interest in assisting outside researchers who are using MIT equipment.

Q: How will MIT operate as part of the consortium?   

A: The consortium will receive proposals through a single portal run in conjunction with the NSF. A steering committee will decide which proposals are accepted and where to route them. The steering committee will rely on guidance from a larger technical review committee, which includes the steering committee members and additional experts. Both committees are made up of researchers from the participating institutions. I will serve on both committees for MIT, and we’ll be appointing a second person to serve on the technical review committee.

Four individuals at MIT (Ben Forget, Nick Roy, Jeremy Kepner of Lincoln Laboratory, and I) will oversee the work at the Institute. The goal of the consortium is to focus on projects where computing is likely to produce relevant advances in one week to three months, though some projects, such as those related to vaccines, may take longer.
