


High Performance Computing (HPC) encompasses advanced computation through parallel processing. The execution time of a given simulation depends upon many factors, such as the number of CPU/GPU cores, their utilisation factor and, of course, the performance, efficiency and scalability of the interconnect. In practice, the interconnect and its associated topology remain the most significant differentiators between HPC systems and less performant systems. The University of Luxembourg has operated a large academic HPC facility since 2007, which remains one of the reference implementations within the country and offers a cutting-edge research infrastructure to Luxembourg public research. The main high-bandwidth, low-latency network of the operated facility relies on the dominant interconnect technology in the HPC market, i.e., Infiniband (IB), over a Fat-tree topology. It is complemented by an Ethernet-based network dedicated to management tasks, external access, and interactions with users' applications that do not support Infiniband natively.
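The source only names the Fat-tree topology; as a hedged illustration of why it scales well, the following sketch computes the switch and host counts of the standard k-ary fat-tree construction built from identical k-port switches (the formulas are the classic ones for this construction, not figures taken from the facility described above).

```python
# Illustrative sketch (assumption: standard k-ary fat-tree construction,
# not the actual dimensions of the facility discussed in the text).
def fat_tree_capacity(k: int) -> dict:
    """Return element counts for a k-ary fat-tree of k-port switches."""
    if k % 2 != 0:
        raise ValueError("k must be even for a k-ary fat-tree")
    return {
        "pods": k,                            # k pods
        "edge_switches": k * (k // 2),        # k/2 edge switches per pod
        "aggregation_switches": k * (k // 2), # k/2 aggregation switches per pod
        "core_switches": (k // 2) ** 2,       # (k/2)^2 core switches
        "hosts": k ** 3 // 4,                 # k^3/4 hosts, full bisection bandwidth
    }

print(fat_tree_capacity(8))
```

For example, with commodity 8-port switches (k = 8) the construction supports 128 hosts while preserving full bisection bandwidth, which is what makes the topology attractive for large HPC interconnects.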
