Department of Mathematics & Computer Science
Documentation by M&CS
This section of the documentation is maintained by M&CS. For suggestions about these pages, please contact Hub MetaForum.
The Department of Mathematics & Computer Science offers several compute solutions. These are part of the catalogue that the department and LIS offer to M&CS researchers and educators.
Solutions
The TU/e adopts a staircase model of compute solutions. Here we discuss the solutions specific to M&CS (step 2) within this model. These resources are managed by LIS, but their full capacity is available to M&CS. To make efficient use of the available resources, these solutions are shared department-wide. Usage policies are set by the department board based on input from M&CS ICT Operations. To see which solution fits your requirements, please contact Research IT.
| Solution | HPC: mcs.default.q (umbrella cluster) | HPC: mcs.gpu.q (umbrella cluster) | mastodont | Virtual machines | IT-Lab |
| --- | --- | --- | --- | --- | --- |
| Compute type | Job-based (Slurm), CPU-bound | Job-based (Slurm), GPU-bound | Long-running non-parallel jobs, RAM-bound | Continuous loads • Experimental use • Proof-of-concept | Hardware housing for experimental use |
| Specifications | See here | See here • Max. 2 GPUs/user simultaneously | 3 TB RAM, 12 TB swap | Maximum (after approval): 4 CPU cores, 16 GB RAM, 250 GB storage¹ | Research & education servers or devices that cannot be placed in the HTC datacenter (e.g. form factor, frequent access needed) |
| Operating system | Linux (details here) | Linux (details here) | Linux (Ubuntu Server 20.04) | Ubuntu Server LTS • Windows Server • other (when required) | Ubuntu Server LTS • Windows Server • other (when required) |
| Funded by | M&CS | M&CS | M&CS | M&CS | t.b.d. |
| Research | • Small to medium-scale compute needs • Unfunded research possible • EngD/PhD candidates | • Small-scale GPU needs • Unfunded research possible • EngD/PhD candidates | • RAM-bound compute needs | • Small-scale VMs • EngD/PhD candidates | |
| Education | • BSc/MSc seminar or thesis projects • Not suitable for large-scale courses (capacity constraints) | • BSc/MSc seminar or thesis projects • Not suitable for large-scale courses (capacity constraints) | • MSc thesis projects • Not suitable for large-scale courses | • Software engineering project • BSc/MSc seminar or thesis projects • Other courses depending on capacity (scale-up/scale-down depending on course timeframe) | |
| Hardware management | LIS (Supercomputing Center) | LIS (Supercomputing Center) | M&CS | LIS (CSS) | User |
| Software management | LIS (Supercomputing Center) | LIS (Supercomputing Center) | Installation after user request (best-effort): LIS (CSS) | User • Installation on user request (best-effort): Hub MetaForum & Lab Manager | User |
| Support level | Best-effort by Supercomputing Center | Best-effort by Supercomputing Center | Best-effort by Hub MetaForum | Best-effort by Hub MetaForum & Lab Manager | Best-effort by Hub MetaForum & Lab Manager |
| Request access through | TOPdesk SSP | TOPdesk SSP | Hub MetaForum | Hub MetaForum or TOPdesk SSP (please specify that this is for MCS-ESX) | Lab Manager |
| Access timeline | 2 working days after approval | 2 working days after approval | 5 working days after approval | 5 working days after approval | t.b.d. |
| Next steps | Snellius • Rescale | Snellius • Rescale | Snellius | SURF Research Cloud • Azure | TU/e datacenter HTC (production machines) • Virtual alternatives |
| More information | M&CS HPC | M&CS HPC | M&CS mastodont | M&CS Virtual Machines | M&CS IT Lab |
| Contact | Research IT or Supercomputing Center | Research IT or Supercomputing Center | prof.dr.ir. J.F. Groote, key user (FSA) | Tijs Poort, Lab Manager | Tijs Poort, Lab Manager |
¹ More details on the product page
LIS CSS: product team Compute and Storage Services (product area Platforms)
LIS SCC: product team Supercomputing Center (product area Research)
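As an illustration of the job-based (Slurm) workflow on the umbrella cluster, the sketch below submits a CPU-bound job to `mcs.default.q`. The partition name comes from the table above; the resource values, file names, and executable are placeholder assumptions, so check the M&CS HPC page for the actual conventions on the cluster.

```bash
#!/bin/bash
# Minimal Slurm batch script for the CPU queue (a sketch; all values are assumptions).
#SBATCH --job-name=example-cpu        # placeholder job name
#SBATCH --partition=mcs.default.q     # CPU-bound queue from the table above
#SBATCH --ntasks=1                    # a single task
#SBATCH --cpus-per-task=4             # assumed core count; adjust to your workload
#SBATCH --mem=8G                      # assumed memory request
#SBATCH --time=04:00:00               # assumed wall-clock limit
#SBATCH --output=slurm-%j.out         # %j expands to the Slurm job ID

srun ./my_experiment                  # placeholder executable
```

Save the script as, for example, `job.sh`, submit it with `sbatch job.sh`, and monitor it with `squeue -u $USER`.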
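The GPU queue follows the same pattern. Per the table above, at most two GPUs per user may be in use simultaneously, so a job should request one or two GPUs via Slurm's generic-resource option. Again a sketch: the GPU types available on `mcs.gpu.q` are not listed here, so the request is left generic.

```bash
#!/bin/bash
# Sketch of a GPU-bound job on the GPU queue (resource values are assumptions).
#SBATCH --job-name=example-gpu        # placeholder job name
#SBATCH --partition=mcs.gpu.q         # GPU-bound queue from the table above
#SBATCH --gres=gpu:1                  # one GPU; max. 2 GPUs/user simultaneously
#SBATCH --cpus-per-task=4             # assumed core count
#SBATCH --mem=16G                     # assumed memory request
#SBATCH --time=08:00:00               # assumed wall-clock limit

srun python train.py                  # placeholder command
```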
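mastodont, by contrast, hosts long-running non-parallel jobs outside a batch system, so a job started over SSH should be detached from the login session to survive a logout. The snippet below shows the generic Unix pattern for this; it is an illustrative assumption, not a documented mastodont convention, and `tmux` or `screen` are equally valid alternatives.

```bash
# Detach a long-running, RAM-bound job from the SSH session
# (generic Unix pattern, not a documented mastodont convention).
nohup ./memory_heavy_analysis > run.log 2>&1 &
echo $! > run.pid    # record the process ID for later monitoring or cleanup
```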