UW Again Boosts High-Performance Computing Capacity

August 14, 2013
Tim Brewer, UW’s end user support manager for information technology, stands between two rows of computer racks that make up Mount Moran. The row on the right was recently added, more than doubling the processing capacity of the high-performance computing facility on campus.

When University of Wyoming faculty members return to campus this fall, they will find Mount Moran has significantly boosted its capacity for high-performance computing. More than double, in fact.

“We have 112 new nodes, bringing us to 194 in total and 93.92 Teraflops (counting the graphics processing units),” says Tim Brewer, UW’s end user support manager for information technology. “This puts us just under the top 500 (high-performance computing clusters in terms of capacity) in the U.S.”

A node is conceptually similar to a desktop computer, while a teraflop is a measure of a computer’s speed, equivalent to one trillion floating-point operations per second.
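To put those figures in rough perspective, the arithmetic below uses only the numbers quoted above. It is an illustration, not an official specification, and the per-node average is approximate because GPU-equipped nodes contribute far more than others.

# Illustrative arithmetic only, using the figures quoted in this article.
TOTAL_TFLOPS = 93.92          # cluster peak capacity, counting GPUs
NODES = 194                   # total node count after the expansion

ops_per_second = TOTAL_TFLOPS * 1e12          # 1 teraflop = 10**12 floating-point operations per second
avg_tflops_per_node = TOTAL_TFLOPS / NODES    # rough average; GPU nodes skew the real distribution

print(f"Peak: {ops_per_second:.3e} floating-point operations per second")
print(f"Average per node: {avg_tflops_per_node:.2f} teraflops")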

The high-performance computing cluster, nicknamed “Mount Moran” after a mountain peak in western Wyoming’s Tetons, and a large-scale storage system make up UW’s Advanced Research Computing Center (ARCC).

The campus cluster, which became available for use in November 2012 and has been fully operational since February, serves two purposes. One, it enables atmospheric and earth sciences faculty -- who will be able to use the NCAR-Wyoming Supercomputing Center (NWSC) -- to learn what to expect from their software. The cluster gives that group of faculty the opportunity to work out issues caused by scaling up parallel algorithms from tens or hundreds of processors to thousands, before moving up to tens of thousands of processors at the NWSC.
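The pattern being scaled in that step can be sketched in a few lines. The example below uses Python’s mpi4py library and is purely illustrative -- it is not drawn from any researcher’s actual code -- but it shows the basic shape of a parallel program on a cluster: the same script runs on every processor, each processor works on its own slice of the problem, and the partial results are combined at the end. How well that pattern holds up is what changes as the processor count climbs from tens to thousands.

# Illustrative only: a toy domain decomposition in the message-passing style used on HPC clusters.
# Run with, for example:  mpiexec -n 8 python scaling_sketch.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's ID (0 .. size-1)
size = comm.Get_size()          # total number of processes launched

N = 1_000_000                   # cells in a hypothetical model domain
chunk = N // size               # cells assigned to each process
start = rank * chunk
stop = N if rank == size - 1 else start + chunk

# Each process computes only over its own slice of the domain.
local_sum = sum(x * x for x in range(start, stop))

# Combine the partial results on process 0.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} processes computed total {total}")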

Two, the cluster provides a research resource for any UW research faculty -- such as bioinformaticists, social scientists, pure mathematicians and theoretical physicists -- who have a complex problem or whose research doesn’t fall within the scope of the NWSC.

Currently, 115 University of Wyoming faculty members, collaborators, students and post-doctoral researchers use the high-performance computing center for their research, Brewer says.

“I definitely expected this to be used right away, and it has been,” says Tim Kuhfuss, director of research support for UW’s Information Technology Center.

Power for Water

A big reason for the recent boost in computing power is Fred Ogden, the Cline Distinguished Chair in the Department of Civil and Architectural Engineering and the Haub School of Environment and Natural Resources, who recently purchased a large block of nodes to further his CI-WATER project research.

The CI-WATER project will develop a high-resolution, physics-based hydrologic model that is applicable over large areas to help assess long-term impacts of water resources management decisions, natural and man-made land-use changes, and climate variability -- with an emphasis on the Rocky Mountain West region.

Through computer simulations, Ogden and other researchers want to determine how much water is available in the Colorado River Basin, including Lake Powell, which straddles the Arizona-Utah border.

Ogden says he purchased 48 nodes for the CI-WATER project last year, and another 90 nodes in May. In all, about $1 million was spent on computer nodes for the project, Ogden says. The CI-WATER project’s share of the total machine cost is reduced because UW covers the costs of racks, cabling and cooling, he adds.

“We are using Mount Moran for code development and smaller scale runs and data analysis,” Ogden says. “Once our model is scaled up to larger watersheds, we will do most of our simulations on Yellowstone.”

Yellowstone is the supercomputer housed at the NWSC.

When the CI-WATER project is not using Mount Moran, the resource is available for others on campus to use, Ogden says. The CI-WATER project partners in Utah also have been running computations on Mount Moran.

“We wrote the CI-WATER proposal, in part, to improve on-campus access to HPC resources,” Ogden explains. “Yellowstone is a production machine that will ultimately be full, giving long wait times for runs. We saw the need for an on-campus research machine that will provide a more responsive environment for parallel code development, testing and small-scale research.”

A Mountain of Research

Unlike at the NWSC, UW faculty members do not apply for core hours on Mount Moran. Rather, they apply for access to Mount Moran by filling out a request form at https://arcc.uwyo.edu/content/arcc-access-request-form.

The cluster, which is available 24/7, operates on a “condominium model”: the university provides the basic infrastructure -- the personnel to run it, high-speed networking and the core computer architecture that keeps it running -- and, in exchange, UW researchers buy computing nodes and/or storage. That investment comes from faculty securing grant funding; successful proposals are expected to include requests for the computational resources needed for their particular research projects.

The university also offers some communal nodes to faculty who do not purchase computer nodes. All nodes, communal and purchased, are available to all users, but communal users receive priority access to communal resources, and invested users receive priority access to their nodes.
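As a conceptual illustration only -- not ARCC’s actual scheduler configuration, and with hypothetical group labels -- the priority rule described above can be sketched as a simple function:

# Toy illustration of the condominium priority rule described in this article.
# Everyone may run on any node; priority decides who goes first.
def has_priority(user_group, node_owner=None):
    # user_group is None for a communal user, or the name of the group that
    # purchased nodes; node_owner is None for a communal node, or the name of
    # the group that purchased it. Group names here are hypothetical labels.
    if node_owner is None:
        # Communal nodes: users without purchased nodes go first.
        return user_group is None
    # Purchased nodes: the purchasing group goes first.
    return user_group == node_owner

print(has_priority(None, None))              # communal user on a communal node -> priority
print(has_priority("ci-water", None))        # invested user on a communal node -> can run, no priority
print(has_priority("ci-water", "ci-water"))  # owner on a purchased node -> priority
print(has_priority(None, "ci-water"))        # communal user on a purchased node -> can run, no priority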

Jeff Clune, a UW assistant professor of computer science, is one faculty member who uses the communal nodes extensively for his research in evolutionary computation. Many students from his Evolving Artificial Intelligence Lab also use Mount Moran for their artificial intelligence research, which includes artificially intelligent robots.

“My lab has infinite computational needs, so we graciously use whatever is available,” Clune says. “Mount Moran is critical to the science my lab conducts. Without it, we would not be able to conduct any meaningful experiments. With it, we have already made many important discoveries that are improving our ability to understand how intelligence evolved so we can recreate that process to produce intelligent robots.”

Other UW faculty, such as Dimitri Mavriplis and Jay Sitaraman, both in mechanical engineering, used Mount Moran to test their computational fluid dynamics research before moving up in scale on Yellowstone, the NWSC supercomputer in Cheyenne, Brewer says.

Continued Growth Expected

While there are now nearly two full rows of nodes in ARCC, Brewer says there’s still room to grow, pointing to some empty racks in the cabinet rows and some open space in the large computing room. He expects the next phase of node purchases to occur in December.

Kuhfuss agrees.

“From what I can see, it (Mount Moran) is going to continue to grow over the next three years,” Kuhfuss says. “Even though we’ve doubled the capacity, it will be used very quickly. I fully expect future contributions (faculty purchase of more nodes) to make it grow even more.”
