Advanced Research Computing Center

The UW Advanced Research Computing Center (ARCC) is the primary research computing facility for the University of Wyoming, housed within the Division of Research and Economic Development. We deploy and maintain in-house scientific computing resources, including high-performance computing clusters and high-speed research storage, and we host specialized services. Our center strives to support and enhance the University's research mission and can serve as a gateway to other research entities.

If you are new to ARCC and high-performance computing, we recommend beginning with our Getting Started guide. You can stay up to date with the latest news and announcements by joining our mailing list.

If you'd like to use our resources, click the button below to request a new project or e-mail us at arcc-help@uwyo.edu with any questions.


Request a Project
Join our Mailing List



Highlights from ARCC Researchers

Read the latest success stories from our users

WyGISC Collaboration Provides Hyper Screen Sharing to Campus & Beyond

Read the full story here

Researchers at the School of Computing’s Wyoming Geographic Science Center (WyGISC) worked with colleagues at the Research and Economic Development’s Advanced Research Computing Center (ARCC) to provide SAGE3. “We have partnered with ARCC on various other successful projects in the past to leverage their systems and expertise, such as Pathfinder and Alcova, so it just made sense for us to reach out to them on this one as well,” said Nicholas Case, a Geospatial Developer at WyGISC.

Case demonstrating ARCC’s implementation of SAGE3 in the Data-X Studio


SAGE3 (https://sage3.sagecommons.org/) is an open-source collaboration platform funded by the National Science Foundation (NSF); it is similar to a digital whiteboard such as Jamboard but offers many more features. It complements video meeting tools like Zoom and Microsoft Teams and integrates artificial intelligence. More specifically, SAGE3 gives researchers and students alike the ability to create collaboration boards to which they can add different applications and widgets for coding, visualization, AI chat, and more for whatever project they are working on together.

A Dance of Two Stars & A Million CPUs

Read the full story here

By working together, a researcher at the University of Wyoming (UWyo) and the Advanced Research Computing Center (ARCC) completed more than a year and a half’s worth of computational work in less than three weeks.

Illustration of a Binary Star System from Britannica.com


Binary star systems (two stars gravitationally bound in orbit around each other) are one of the many phenomena found in our galaxy that seem drawn from science fiction. Tatooine, anyone? Yet they are quite common in our very own Milky Way. That said, much research is still needed to classify eclipsing binary systems. One such researcher, Megan Frank, a PhD candidate in UWyo’s Department of Physics and Astronomy, is attempting to do just that. As Megan describes her project, “this project aims to classify four eclipsing binary systems located in the Large Magellanic Cloud (a dwarf galaxy of the Milky Way). These objects are a bit unique in that they showcase a unique reflection effect in their light curves, meaning that the space between the primary and secondary eclipse is sloped rather than flat.”


Documentation and Training Materials

ARCC offers a large selection of documentation and on-demand training to users. Click the button below to view our training and documentation index.

ARCC Documentation Index



Why use HPC?

High-Performance Computing (HPC) refers to using powerful computers to perform complex calculations and simulations that are difficult or impossible on a conventional desktop computer. HPC can offer users significant benefits in many applications. While not necessary for every project, it is a very valuable tool for researchers who want to spend less time waiting on computations or simulations.

Improved Speed and Efficiency
HPC allows you to perform calculations and simulations much faster than you can on a regular desktop computer or laptop. This lets you complete tasks quickly, save time and money, and make more informed decisions.
Handling Large Datasets
HPC can process and analyze large amounts of data more easily than conventional computers, which is vital in many fields, including data analytics, machine learning, and artificial intelligence.
Flexibility and Scalability
Users can scale to much larger problem sizes and configurations when running on HPC. HPC resources now also include purpose-built accelerator technologies such as GPUs for modern compute-intensive workloads.
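To give a concrete sense of how work is run on an HPC cluster, the sketch below shows a minimal batch script for a Slurm-style job scheduler. This is an illustration only: the scheduler, resource values, module names, and program name are assumptions for the example, not ARCC-specific settings; consult your cluster's documentation for actual values.

```shell
#!/bin/bash
# Minimal example batch script for a Slurm-style scheduler.
# NOTE: all names and values below are placeholders for illustration.
#SBATCH --job-name=example-sim
#SBATCH --nodes=2                 # scale up by requesting more nodes
#SBATCH --ntasks-per-node=16      # parallel tasks (e.g., MPI ranks) per node
#SBATCH --time=02:00:00           # wall-clock limit (HH:MM:SS)
#SBATCH --output=%x-%j.out        # log file named <job-name>-<job-id>.out

# Load the software environment provided by the cluster (names vary by site).
module load gcc openmpi

# Launch the parallel program across all allocated tasks.
srun ./my_simulation input.dat
```

Such a script is typically submitted with `sbatch script.sh`; the scheduler queues the job and runs it when the requested nodes become available. Requesting more nodes or tasks is how a cluster turns weeks of serial desktop computation into days, or hours, of parallel work.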