Tagged: "hpc"
14 documents tagged with "hpc"
Accessing Arseven
How to request an account on the Arseven HPC cluster for the Texas A&M Department of Statistics — faculty, staff, students, and sponsored collaborators.
Arseven Hardware
Detailed hardware specifications for the Arseven HPC cluster — login, CPU, GPU, and file server nodes.
Arseven HPC Cluster
The Arseven HPC cluster supports research computing for the Texas A&M Department of Statistics. 1,536 cores across 12 nodes with NVIDIA A30 GPUs.
Arseven Acceptable Use Policy
Acceptable use policy, data classification rules, and shared-resource expectations for the Arseven HPC cluster.
Arseven User Quotas
Storage quotas on the Arseven HPC cluster and how to check your current usage with the showquota command.
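As a quick sketch of the quota check this entry describes (the `showquota` command name comes from the entry itself; its output format and any flags are assumptions, so consult the linked document):

```shell
# Check your current storage usage against your quotas (run on a login node).
showquota

# If you need a per-directory breakdown, standard tools work as a fallback:
du -sh "$HOME"/*
```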
Arseven SLURM Partitions
Available SLURM partitions on the Arseven HPC cluster with default and maximum time limits.
Using JupyterLab on Arseven
Run JupyterLab interactively on the Arseven HPC cluster via SSH tunnel and local web browser.
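The SSH-tunnel workflow mentioned here can be sketched as follows; the hostname `arseven.stat.tamu.edu`, the `NetID` login, and the port number are placeholders, not confirmed values from the linked document:

```shell
# 1. On the cluster: start JupyterLab without opening a browser, on a fixed port.
jupyter lab --no-browser --port=8888

# 2. On your local machine: forward local port 8888 to the same port on the cluster.
ssh -N -L 8888:localhost:8888 NetID@arseven.stat.tamu.edu

# 3. In a local web browser, open the http://localhost:8888/?token=... URL
#    printed by the jupyter command in step 1.
```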
Using R on Arseven
Load and use R on the Arseven HPC cluster via the module system.
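A typical Environment Modules session for this workflow might look like the following; the exact module names and available R versions on Arseven are assumptions:

```shell
module avail R        # list the R builds installed on the cluster
module load R         # load the default R module into your environment
R --version           # confirm which R is now on your PATH
```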
Arseven User Directories
Home, scratch, and tmp directory layout on the Arseven HPC cluster.
Connecting to HPC Systems
Connect to TAMU HPC clusters via SSH from Windows (MobaXterm), Mac (Terminal + XQuartz), or Linux.
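A minimal sketch of the SSH connection step common to all three platforms; the hostname `arseven.stat.tamu.edu` and the `NetID` username are illustrative placeholders:

```shell
# Basic SSH login from a terminal (Mac/Linux, or MobaXterm's shell on Windows).
ssh NetID@arseven.stat.tamu.edu

# With X11 forwarding for graphical applications (requires XQuartz on macOS).
ssh -Y NetID@arseven.stat.tamu.edu
```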
Research Computing
High Performance Computing (HPC) clusters, shared research infrastructure, and Slurm job scheduling for Texas A&M research users.
Orchard HPC Cluster
The Orchard HPC cluster for the Texas A&M NUEN (Nuclear Engineering) Department, hosted on engineering infrastructure.
RSICC Software Requests
How to request access to Oak Ridge National Laboratory's RSICC-licensed codes (MCNP, SCALE, RELAP) on the Orchard HPC cluster.
Submitting Jobs with Slurm
Write batch submission scripts, run interactive sessions, and understand common Slurm parameters for HPC job submission at Texas A&M.
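A minimal Slurm batch script of the kind this document covers might look like the following; the partition name, resource values, and module name are placeholders rather than Arseven defaults:

```shell
#!/bin/bash
# Minimal Slurm batch script sketch. Partition, time limit, and CPU
# count below are illustrative values, not site-specific defaults.
#SBATCH --job-name=example
#SBATCH --partition=short
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4
#SBATCH --time=01:00:00
#SBATCH --output=%x-%j.out

module load R
Rscript analysis.R
```

Submit the script with `sbatch job.sh`; for an interactive session, `srun --pty bash` requests a shell on a compute node.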