Policies

From NU HPC Wiki
Revision as of 15:23, 25 April 2024 by Admin (talk | contribs) (Minor edit)

Note

Software configurations on NU HPC facilities are updated continuously. Limits on job execution and maximum storage allocations are subject to change based on decisions of the NU HPC Committee and actual system utilization.

Acceptable Use

The HPC system is a unique resource for NU researchers and the community. It has special characteristics, such as a large amount of RAM and the capability for massive parallelism. Due to its uniqueness and expense, its use is supervised by the HPC team to ensure fair and efficient utilization.

Storage quotas

The current default storage quota for users’ home directories is 100 GB. If an individual user requires more storage for their work, it can be allocated through a special request to the HPC admins. For particularly large, multi-terabyte storage needs, Shabyt has an HDD array with a total capacity of 120 TB.
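To see how close you are to the quota, standard Linux tools report current usage. This is a minimal sketch; the exact quota-reporting command depends on the file system in use (e.g. `quota -s` or `lfs quota` may apply instead), so treat the commands below as generic examples:

```shell
# Report the total space used by your home directory
# (may take a while for large directory trees).
du -sh "$HOME"

# Show free and used space on the file system holding your home directory.
df -h "$HOME"
```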

Data backup

Please be advised that users take full responsibility for the integrity and safety of their data stored on Shabyt. While Shabyt features enterprise-level hardware, failures are still possible, especially because the underlying storage systems, which are designed for high throughput, have no redundancy. Shabyt currently has no automatic backup of users’ data. While this may change in the future as the HPC team continues to configure the system, at this time users are strongly advised to perform regular backups of any important data stored in their home directories.

Queues and the number of jobs

Currently, Shabyt has two partitions for user jobs. While the system is still being configured and fine-tuned, there is no hard limit on the number of jobs an individual user may submit; this will likely change in the near future.
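The term "partition" suggests the scheduler is Slurm; assuming so, a batch job names the partition it targets in its job script. This is a sketch only, and the partition name `compute` is a placeholder (the real partition names can be listed with `sinfo` on the cluster):

```shell
#!/bin/bash
#SBATCH --job-name=example        # job name shown in the queue
#SBATCH --partition=compute       # placeholder; list real partitions with sinfo
#SBATCH --ntasks=1                # single task
#SBATCH --time=00:10:00           # wall-clock limit (HH:MM:SS)

srun hostname                     # replace with your actual program
```

Submit with `sbatch jobscript.sh` and check your queued and running jobs with `squeue -u $USER`.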

Acknowledgment

If the computational resources provided by NU HPC facilities were an important asset in work resulting in a publication, we would greatly appreciate an acknowledgment.