Test Report: results

Test Suite: InstallTests.computes_installed-20241103185709

Results

Duration: 1.062 sec
Tests: 5
Failures: 0

Tests

InstallTests.computes_installed

Test case: test_01_numcomputes_greater_than_zero
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test case: test_02_koomie_cf_available
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test case: test_03_nonzero_results_from_uptime
Outcome: Passed
Duration: 1.061 sec
Failed: None

Test case: test_04_correct_number_of_hosts_booted
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test case: test_05_verify_boot_times_are_reasonable
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test Suite: sms_installed.bats

Results

Duration: 0.052 sec
Tests: 2
Failures: 0

Tests

sms_installed.bats

Test case: Verify hostname matches expectations
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: Base OS check
Outcome: Passed
Duration: 0.03 sec
Failed: None

Test Suite: clustershell

Results

Duration0.968 sec
Tests2
Failures0

Tests

clustershell

Test case:[clush] check for OS-provided RPM
Outcome:Passed
Duration:0.049 sec
FailedNone
None
Test case:[clush] clush -Sg compute
Outcome:Passed
Duration:0.919 sec
FailedNone
None

Test Suite: conman

Results

Duration0.079 sec
Tests2
Failures0

Tests

conman

Test case:[conman] check for RPM
Outcome:Passed
Duration:0.048 sec
FailedNone
None
Test case:[conman] query conmand conman
Outcome:Passed
Duration:0.031 sec
FailedNone
None

Test Suite: genders

Results

Duration0.078 sec
Tests2
Failures0

Tests

genders

Test case:[genders] check for RPM
Outcome:Passed
Duration:0.048 sec
FailedNone
None
Test case:[genders] check node attributes
Outcome:Passed
Duration:0.03 sec
FailedNone
None

Test Suite: nhc

Results

Duration5.439 sec
Tests3
Failures0

Tests

nhc

Test case:[nhc] check for RPM
Outcome:Passed
Duration:0.049 sec
FailedNone
None
Test case:[nhc] generate config file
Outcome:Passed
Duration:2.699 sec
FailedNone
None
Test case:[nhc] service failure detection and restart
Outcome:Passed
Duration:2.691 sec
FailedNone
None

Test Suite: slurm-plugins

Results

Duration1.288 sec
Tests4
Failures0

Tests

slurm-plugins

Test case:[slurm] check for jobcomp_elasticsearch plugin
Outcome:Passed
Duration:0.029 sec
FailedNone
None
Test case:[slurm] check for job_submit_lua plugin
Outcome:Passed
Duration:0.029 sec
FailedNone
None
Test case:[slurm] check for --x11 option
Outcome:Passed
Duration:0.039 sec
FailedNone
None
Test case:[slurm] check for sview rpm availability
Outcome:Passed
Duration:1.191 sec
FailedNone
None

Test Suite: lmod

Results

Duration0.42 sec
Tests1
Failures0

Tests

lmod

Test case:[lmod] test that the setup function passed
Outcome:Passed
Duration:0.42 sec
FailedNone
None

Test Suite: spack

Results

Duration58.424 sec
Tests4
Failures0

Tests

spack

Test case:[spack] check for RPM
Outcome:Passed
Duration:0.547 sec
FailedNone
None
Test case:[spack] add compiler
Outcome:Passed
Duration:1.874 sec
FailedNone
None
Test case:[spack] test build
Outcome:Passed
Duration:54.698 sec
FailedNone
None
Test case:[spack] test module refresh
Outcome:Passed
Duration:1.305 sec
FailedNone
None

Test Suite: build

Results

Duration6.484 sec
Tests1
Failures0

Tests

build

Test case:[/apps/hpcg] build HPCG executable (gnu14/mpich)
Outcome:Passed
Duration:6.484 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration16.138 sec
Tests3
Failures0

Tests

rm_execution_multi_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:16.062 sec
FailedNone
None
Test case:[/apps/hpcg] log HPCG multi node results (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.057 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration12.948 sec
Tests3
Failures0

Tests

rm_execution_single_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:12.872 sec
FailedNone
None
Test case:[/apps/hpcg] log HPCG single node results (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.057 sec
FailedNone
None

Test Suite: build

Results

Duration4.471 sec
Tests1
Failures0

Tests

build

Test case:[/apps/hpcg] build HPCG executable (gnu14/openmpi5)
Outcome:Passed
Duration:4.471 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration17.258 sec
Tests3
Failures0

Tests

rm_execution_multi_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:17.181 sec
FailedNone
None
Test case:[/apps/hpcg] log HPCG multi node results (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.058 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration16.183 sec
Tests3
Failures0

Tests

rm_execution_single_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:16.104 sec
FailedNone
None
Test case:[/apps/hpcg] log HPCG single node results (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.06 sec
FailedNone
None

Test Suite: build

Results

Duration6.189 sec
Tests1
Failures0

Tests

build

Test case:[/apps/hpcg] build HPCG executable (gnu14/mvapich2)
Outcome:Passed
Duration:6.189 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration25.677 sec
Tests3
Failures0

Tests

rm_execution_multi_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:25.602 sec
FailedNone
None
Test case:[/apps/hpcg] log HPCG multi node results (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.056 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration14.001 sec
Tests3
Failures0

Tests

rm_execution_single_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:13.926 sec
FailedNone
None
Test case:[/apps/hpcg] log HPCG single node results (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.056 sec
FailedNone
None

Test Suite: build

Results

Duration9.888 sec
Tests1
Failures0

Tests

build

Test case:[Apps/miniFE] build MiniFE executable (gnu14/mpich)
Outcome:Passed
Duration:9.888 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration15.062 sec
Tests2
Failures0

Tests

rm_execution_multi_host

Test case:[Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:15.01 sec
FailedNone
None
Test case:[Apps/miniFE] log miniFE multi node results (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.052 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration3.375 sec
Tests2
Failures0

Tests

rm_execution_single_host

Test case:[Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.322 sec
FailedNone
None
Test case:[Apps/miniFE] log miniFE single node results (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.053 sec
FailedNone
None

Test Suite: build

Results

Duration7.635 sec
Tests1
Failures0

Tests

build

Test case:[Apps/miniFE] build MiniFE executable (gnu14/openmpi5)
Outcome:Passed
Duration:7.635 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration15.101 sec
Tests2
Failures0

Tests

rm_execution_multi_host

Test case:[Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:15.047 sec
FailedNone
None
Test case:[Apps/miniFE] log miniFE multi node results (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.054 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration5.513 sec
Tests2
Failures0

Tests

rm_execution_single_host

Test case:[Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:5.459 sec
FailedNone
None
Test case:[Apps/miniFE] log miniFE single node results (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.054 sec
FailedNone
None

Test Suite: build

Results

Duration9.519 sec
Tests1
Failures0

Tests

build

Test case:[Apps/miniFE] build MiniFE executable (gnu14/mvapich2)
Outcome:Passed
Duration:9.519 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration17.154 sec
Tests2
Failures0

Tests

rm_execution_multi_host

Test case:[Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:17.103 sec
FailedNone
None
Test case:[Apps/miniFE] log miniFE multi node results (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.051 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration2.297 sec
Tests2
Failures0

Tests

rm_execution_single_host

Test case:[Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.249 sec
FailedNone
None
Test case:[Apps/miniFE] log miniFE single node results (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.048 sec
FailedNone
None

Test Suite: os_distribution

Results

Duration0.018 sec
Tests1
Failures0

Tests

os_distribution

Test case:[BOS] OS distribution matches (local)
Outcome:Passed
Duration:0.018 sec
FailedNone
None

Test Suite: computes

Results

Duration4.362 sec
Tests4
Failures0

Tests

computes

Test case: [BOS] OS distribution matches (2 active computes)
Outcome: Skipped
Duration: 0.0 sec
Failed: None
Skipped: disable BOS_RELEASE check
Test case:[BOS] consistent kernel (2 active computes)
Outcome:Passed
Duration:1.095 sec
FailedNone
None
Test case:[BOS] increased locked memory limits
Outcome:Passed
Duration:2.176 sec
FailedNone
None
Test case:[BOS] syslog forwarding
Outcome:Passed
Duration:1.091 sec
FailedNone
None

Test Suite: debugger

Results

Duration0.036 sec
Tests2
Failures0

Tests

debugger

Test case:[Compilers] debugger man page (gnu14)
Outcome:Passed
Duration:0.017 sec
FailedNone
None
Test case:[Compilers] debugger availability (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: man_pages

Results

Duration0.13 sec
Tests3
Failures0

Tests

man_pages

Test case:[Compilers] C compiler man/help page (gnu14)
Outcome:Passed
Duration:0.04 sec
FailedNone
None
Test case:[Compilers] C++ compiler man/help page (gnu14)
Outcome:Passed
Duration:0.048 sec
FailedNone
None
Test case:[Compilers] Fortran compiler man/help page (gnu14)
Outcome:Passed
Duration:0.042 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.913 sec
Tests6
Failures0

Tests

rm_execution

Test case:[Compilers] C binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.143 sec
FailedNone
None
Test case:[Compilers] C++ binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.161 sec
FailedNone
None
Test case:[Compilers] Fortran binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.149 sec
FailedNone
None
Test case:[Compilers] C openmp binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.147 sec
FailedNone
None
Test case:[Compilers] C++ openmp binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.151 sec
FailedNone
None
Test case:[Compilers] Fortran openmp binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.162 sec
FailedNone
None

Test Suite: version_match

Results

Duration0.411 sec
Tests3
Failures0

Tests

version_match

Test case:[Compilers] compiler module loaded (gnu14)
Outcome:Passed
Duration:0.12 sec
FailedNone
None
Test case:[Compilers] compiler module version available (gnu14)
Outcome:Passed
Duration:0.121 sec
FailedNone
None
Test case:[Compilers] C, C++, and Fortran versions match module (gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None

Test Suite: test_autotools

Results

Duration5.015 sec
Tests4
Failures0

Tests

test_autotools

Test case:[/dev-tools/autotools] running autoreconf
Outcome:Passed
Duration:3.173 sec
FailedNone
None
Test case:[/dev-tools/autotools] run generated configure
Outcome:Passed
Duration:1.655 sec
FailedNone
None
Test case:[/dev-tools/autotools] run make on generated Makefile
Outcome:Passed
Duration:0.165 sec
FailedNone
None
Test case:[/dev-tools/autotools] run compiled binary
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_cmake

Results

Duration1.983 sec
Tests4
Failures0

Tests

test_cmake

Test case:[/dev-tools/cmake] running cmake --system-information
Outcome:Passed
Duration:0.764 sec
FailedNone
None
Test case:[/dev-tools/cmake] run cmake
Outcome:Passed
Duration:0.392 sec
FailedNone
None
Test case:[/dev-tools/cmake] run make on generated Makefile
Outcome:Passed
Duration:0.805 sec
FailedNone
None
Test case:[/dev-tools/cmake] run compiled binary
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: EasyBuild

Results

Duration10.812 sec
Tests3
Failures0

Tests

EasyBuild

Test case:[EasyBuild] check for RPM
Outcome:Passed
Duration:0.086 sec
FailedNone
None
Test case:[EasyBuild] test executable
Outcome:Passed
Duration:1.272 sec
FailedNone
None
Test case:[EasyBuild] quick test install of bzip2
Outcome:Passed
Duration:9.454 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.399 sec
Tests2
Failures0

Tests

rm_execution

Test case:[dev-tools/hwloc] lstopo runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.215 sec
FailedNone
None
Test case:[dev-tools/hwloc] hwloc_hello runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.184 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.288 sec
Tests7
Failures0

Tests

test_module

Test case:[HWLOC] Verify HWLOC module is loaded and matches rpm version (gnu14/)
Outcome:Passed
Duration:0.167 sec
FailedNone
None
Test case:[HWLOC] Verify module HWLOC_DIR is defined and exists (gnu14/)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[HWLOC] Verify module HWLOC_LIB is defined and exists (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HWLOC] Verify dynamic library available in HWLOC_LIB (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HWLOC] Verify static library is not present in HWLOC_LIB (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HWLOC] Verify module HWLOC_INC is defined and exists (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HWLOC] Verify header file is present in HWLOC_INC (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: hipy

Results

Duration2.262 sec
Tests1
Failures0

Tests

hipy

Test case:[dev-tools/py3-mpi4py] python hello world (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.262 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.059 sec
Tests3
Failures0

Tests

test_module

Test case:[mpi4py] Verify module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[mpi4py] Verify PYTHONPATH is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: hipy

Results

Duration3.338 sec
Tests1
Failures0

Tests

hipy

Test case:[dev-tools/py3-mpi4py] python hello world (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.338 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.06 sec
Tests3
Failures0

Tests

test_module

Test case:[mpi4py] Verify module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[mpi4py] Verify PYTHONPATH is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: hipy

Results

Duration: 4.402 sec
Tests: 1
Failures: 1

Tests

hipy

Test case: [dev-tools/py3-mpi4py] python hello world (slurm/gnu14/mvapich2)
Outcome: Failed
Duration: 4.402 sec
Failed: (from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file hipy, line 35)
  `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11947
Batch job 32 submitted

Job 32 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm python3 helloworld.py 8 (family=mvapich2)
[c2:mpi_rank_4][MPICM_Init_UD_CM] sizeof cm_msg (1296) >= rdma_default_mtu (1024).
[c2:mpi_rank_4][MPICM_Init_UD_CM] Try increasing the MV2_DEFAULT_MTU or reduce MAX_NUM_HCAS, or MAX_NUM_QP_PER_PORT in ibv_param.h.
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)...:
MPID_Init(419)..........: channel initialization failed
MPIDI_CH3_Init(591).....:
MPIDI_CH3I_CM_Init(2052):
MPICM_Init_UD_CM(2092)..:
(unknown)(): Other MPI error

[cli_4]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)...:
MPID_Init(419)..........: channel initialization failed
MPIDI_CH3_Init(591).....:
MPIDI_CH3I_CM_Init(2052):
MPICM_Init_UD_CM(2092)..:
(unknown)(): Other MPI error
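
Note: the MVAPICH2 diagnostic above suggests increasing MV2_DEFAULT_MTU so that the connection-manager message (1296 bytes) fits within the UD MTU. A minimal, untested workaround sketch follows; the IBV_MTU_2048 value and placing the export before the launch command are assumptions, not part of the test suite or this run:

    # Hypothetical rerun of the failing launch with a larger default MTU.
    # Value format per MVAPICH2 user guide (assumption); verify against the
    # installed MVAPICH2 documentation before use.
    export MV2_DEFAULT_MTU=IBV_MTU_2048
    mpiexec.hydra -bootstrap slurm python3 helloworld.py 8

If the error persists, the diagnostic's alternative (reducing MAX_NUM_HCAS or MAX_NUM_QP_PER_PORT in ibv_param.h) requires rebuilding MVAPICH2.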

Test Suite: test_module

Results

Duration0.059 sec
Tests3
Failures0

Tests

test_module

Test case:[mpi4py] Verify module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[mpi4py] Verify PYTHONPATH is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: MM

Results

Duration1.355 sec
Tests1
Failures0

Tests

MM

Test case:[dev-tools/py3-numpy] Numpy Matrix Multiply
Outcome:Passed
Duration:1.355 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.08 sec
Tests4
Failures0

Tests

test_module

Test case:[Numpy] Verify NUMPY modules can be loaded and match rpm version (gnu14)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[Numpy] Verify module NUMPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[Numpy] Verify module NUMPY_BIN is defined and exists
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[Numpy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: springs

Results

Duration0.361 sec
Tests1
Failures0

Tests

springs

Test case:[dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/mpich)
Outcome:Passed
Duration:0.361 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.06 sec
Tests3
Failures0

Tests

test_module

Test case:[scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: springs

Results

Duration0.356 sec
Tests1
Failures0

Tests

springs

Test case:[dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/openmpi5)
Outcome:Passed
Duration:0.356 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.062 sec
Tests3
Failures0

Tests

test_module

Test case:[scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: springs

Results

Duration0.365 sec
Tests1
Failures0

Tests

springs

Test case:[dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/mvapich2)
Outcome:Passed
Duration:0.365 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.059 sec
Tests3
Failures0

Tests

test_module

Test case:[scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.003 sec
Tests2
Failures0

Tests

rm_execution

Test case:[Valgrind] Callgrind execution under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.486 sec
FailedNone
None
Test case:[Valgrind] Memcheck execution under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.517 sec
FailedNone
None

Test Suite: test_module

Results

Duration3.426 sec
Tests8
Failures0

Tests

test_module

Test case:[Valgrind] Verify valgrind module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[Valgrind] Verify module VALGRIND_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[Valgrind] Verify availability of valgrind binary (gnu14)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[Valgrind] Verify availability of man page (gnu14)
Outcome:Passed
Duration:0.054 sec
FailedNone
None
Test case:[Valgrind] Verify module VALGRIND_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[Valgrind] Verify header file is present in VALGRIND_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[Valgrind] Callgrind compile/test (gnu14)
Outcome:Passed
Duration:1.524 sec
FailedNone
None
Test case:[Valgrind] Memcheck compile/test (gnu14)
Outcome:Passed
Duration:1.591 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration12.643 sec
Tests1
Failures0

Tests

rm_execution

Test case:[R] Running Rscript bench.R under resource manager (slurm/gnu14)
Outcome:Passed
Duration:12.643 sec
FailedNone
None

Test Suite: test_module

Results

Duration12.448 sec
Tests7
Failures0

Tests

test_module

Test case:[R] Verify R module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.214 sec
FailedNone
None
Test case:[R] Verify R_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[R] Verify availability of R executable -> R (gnu14)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[R] Verify availability of R executable -> Rscript (gnu14)
Outcome:Passed
Duration:0.032 sec
FailedNone
None
Test case:[R] Verify module R_SHARE is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[R] Running bench.R test
Outcome:Passed
Duration:11.572 sec
FailedNone
None
Test case:[R] Verify ability to compile C code for R and execute
Outcome:Passed
Duration:0.554 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration4.544 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.275 sec
FailedNone
None
Test case:[libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.269 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.332 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.201 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration3.483 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.278 sec
FailedNone
None
Test case:[libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.205 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.326 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.197 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome:Passed
Duration:0.025 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration7.778 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.357 sec
FailedNone
None
Test case:[libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.421 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.336 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.674 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Accumulators] min-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[Boost/Accumulators] max-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.166 sec
FailedNone
None
Test case:[Boost/Accumulators] variance-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.172 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.683 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Accumulators] min-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.171 sec
FailedNone
None
Test case:[Boost/Accumulators] max-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.169 sec
FailedNone
None
Test case:[Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.168 sec
FailedNone
None
Test case:[Boost/Accumulators] variance-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.175 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.678 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Accumulators] min-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[Boost/Accumulators] max-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[Boost/Accumulators] variance-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.16 sec
FailedNone
None

Test Suite: test_module

Results

Duration3.293 sec
Tests8
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.209 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Test interactive build/exec of f_test.cpp (gnu14/mpich)
Outcome:Passed
Duration:2.961 sec
FailedNone
None

Test Suite: test_module

Results

Duration3.375 sec
Tests8
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.213 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Test interactive build/exec of f_test.cpp (gnu14/openmpi5)
Outcome:Passed
Duration:3.034 sec
FailedNone
None

Test Suite: test_module

Results

Duration3.346 sec
Tests8
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Test interactive build/exec of f_test.cpp (gnu14/mvapich2)
Outcome:Passed
Duration:3.022 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.652 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Multi Array] access-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.167 sec
FailedNone
None
Test case:[Boost/Multi Array] resize-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.146 sec
FailedNone
None
Test case:[Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.177 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.665 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Multi Array] access-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.171 sec
FailedNone
None
Test case:[Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.176 sec
FailedNone
None
Test case:[Boost/Multi Array] resize-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.155 sec
FailedNone
None
Test case:[Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.163 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.646 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Multi Array] access-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.165 sec
FailedNone
None
Test case:[Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.168 sec
FailedNone
None
Test case:[Boost/Multi Array] resize-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.16 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.369 sec
Tests1
Failures0

Tests

rm_execution

Test case:[Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.369 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.361 sec
Tests1
Failures0

Tests

rm_execution

Test case:[Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.361 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.357 sec
Tests1
Failures0

Tests

rm_execution

Test case:[Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.357 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.446 sec
Tests11
Failures0

Tests

rm_execution

Test case:[Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.176 sec
FailedNone
None
Test case:[Boost/Program Options] exception_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[Boost/Program Options] options_description_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.154 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test on master host(gnu14/mpich)
Outcome:Passed
Duration:0.029 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.15 sec
FailedNone
None
Test case:[Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.145 sec
FailedNone
None
Test case:[Boost/Program Options] required_test on master host(gnu14/mpich)
Outcome:Passed
Duration:0.028 sec
FailedNone
None
Test case:[Boost/Program Options] required_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[Boost/Program Options] unicode_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.147 sec
FailedNone
None
Test case:[Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.143 sec
FailedNone
None
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.149 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.509 sec
Tests11
Failures0

Tests

rm_execution

Test case:[Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.175 sec
FailedNone
None
Test case:[Boost/Program Options] exception_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.149 sec
FailedNone
None
Test case:[Boost/Program Options] options_description_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.173 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test on master host(gnu14/openmpi5)
Outcome:Passed
Duration:0.03 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.165 sec
FailedNone
None
Test case:[Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.144 sec
FailedNone
None
Test case:[Boost/Program Options] required_test on master host(gnu14/openmpi5)
Outcome:Passed
Duration:0.028 sec
FailedNone
None
Test case:[Boost/Program Options] required_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.156 sec
FailedNone
None
Test case:[Boost/Program Options] unicode_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.154 sec
FailedNone
None
Test case:[Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.165 sec
FailedNone
None
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.17 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.529 sec
Tests11
Failures0

Tests

rm_execution

Test case:[Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.174 sec
FailedNone
None
Test case:[Boost/Program Options] exception_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.167 sec
FailedNone
None
Test case:[Boost/Program Options] options_description_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.157 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test on master host(gnu14/mvapich2)
Outcome:Passed
Duration:0.028 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.147 sec
FailedNone
None
Test case:[Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[Boost/Program Options] required_test on master host(gnu14/mvapich2)
Outcome:Passed
Duration:0.027 sec
FailedNone
None
Test case:[Boost/Program Options] required_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.166 sec
FailedNone
None
Test case:[Boost/Program Options] unicode_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.168 sec
FailedNone
None
Test case:[Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.157 sec
FailedNone
None
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.168 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.914 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.183 sec
FailedNone
None
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.41 sec
FailedNone
None
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.166 sec
FailedNone
None
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.155 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.904 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.175 sec
FailedNone
None
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.403 sec
FailedNone
None
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.171 sec
FailedNone
None
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.155 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.884 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.149 sec
FailedNone
None
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.412 sec
FailedNone
None
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.159 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.213 sec
Tests1
Failures0

Tests

rm_execution

Test case:[Boost] regress under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.213 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.943 sec
Tests3
Failures0

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.589 sec
FailedNone
None
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.148 sec
FailedNone
None
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.206 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.965 sec
Tests3
Failures0

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.589 sec
FailedNone
None
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.165 sec
FailedNone
None
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.211 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.976 sec
Tests3
Failures0

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.585 sec
FailedNone
None
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.179 sec
FailedNone
None
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.212 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.337 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.214 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.335 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.208 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.326 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.204 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration17.793 sec
Tests6
Failures0

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.259 sec
FailedNone
None
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.319 sec
FailedNone
None
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.257 sec
FailedNone
None
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.321 sec
FailedNone
None
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.254 sec
FailedNone
None
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.383 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration15.724 sec
Tests6
Failures0

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.261 sec
FailedNone
None
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.331 sec
FailedNone
None
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.264 sec
FailedNone
None
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.264 sec
FailedNone
None
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.334 sec
FailedNone
None
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.27 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration22.16 sec
Tests6
Failures0

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.403 sec
FailedNone
None
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.334 sec
FailedNone
None
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.34 sec
FailedNone
None
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.34 sec
FailedNone
None
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.401 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.303 sec
Tests7
Failures0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.179 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.305 sec
Tests7
Failures0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.179 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.295 sec
Tests7
Failures0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration4.914 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.304 sec
FailedNone
None
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.314 sec
FailedNone
None
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.296 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration6.064 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.349 sec
FailedNone
None
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:5.378 sec
FailedNone
None
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.337 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration16.157 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.175 sec
FailedNone
None
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:15.811 sec
FailedNone
None
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.171 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration36.736 sec
Tests50
Failures0

Tests

rm_execution

Test case:[libs/GSL] run test_gsl_histogram (gnu14)
Outcome:Passed
Duration:0.041 sec
FailedNone
None
Test case:[libs/GSL] run block under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.267 sec
FailedNone
None
Test case:[libs/GSL] run bspline under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.529 sec
FailedNone
None
Test case:[libs/GSL] run cblas under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[libs/GSL] run cdf under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.167 sec
FailedNone
None
Test case:[libs/GSL] run cheb under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.168 sec
FailedNone
None
Test case:[libs/GSL] run combination under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.155 sec
FailedNone
None
Test case:[libs/GSL] run complex under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[libs/GSL] run const under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.148 sec
FailedNone
None
Test case:[libs/GSL] run deriv under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.151 sec
FailedNone
None
Test case:[libs/GSL] run dht under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.208 sec
FailedNone
None
Test case:[libs/GSL] run diff under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[libs/GSL] run eigen under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.33 sec
FailedNone
None
Test case:[libs/GSL] run err under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[libs/GSL] run fft under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.5 sec
FailedNone
None
Test case:[libs/GSL] run fit under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[libs/GSL] run histogram under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.22 sec
FailedNone
None
Test case:[libs/GSL] run ieee-utils under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.142 sec
FailedNone
None
Test case:[libs/GSL] run integration under resource manager (slurm/gnu14)
Outcome:Passed
Duration:3.03 sec
FailedNone
None
Test case:[libs/GSL] run interpolation under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[libs/GSL] run linalg under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.284 sec
FailedNone
None
Test case:[libs/GSL] run matrix under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.732 sec
FailedNone
None
Test case:[libs/GSL] run min under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.167 sec
FailedNone
None
Test case:[libs/GSL] run monte under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.415 sec
FailedNone
None
Test case:[libs/GSL] run multifit under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.823 sec
FailedNone
None
Test case:[libs/GSL] run multilarge under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.202 sec
FailedNone
None
Test case:[libs/GSL] run multimin under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[libs/GSL] run multiroots under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[libs/GSL] run multiset under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.137 sec
FailedNone
None
Test case:[libs/GSL] run ntuple under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.16 sec
FailedNone
None
Test case:[libs/GSL] run ode-initval under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.431 sec
FailedNone
None
Test case:[libs/GSL] run ode-initval2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:4.923 sec
FailedNone
None
Test case:[libs/GSL] run permutation under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.159 sec
FailedNone
None
Test case:[libs/GSL] run poly under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.152 sec
FailedNone
None
Test case:[libs/GSL] run qrng under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.149 sec
FailedNone
None
Test case:[libs/GSL] run randist under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.342 sec
FailedNone
None
Test case:[libs/GSL] run rng under resource manager (slurm/gnu14)
Outcome:Passed
Duration:2.961 sec
FailedNone
None
Test case:[libs/GSL] run roots under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.159 sec
FailedNone
None
Test case:[libs/GSL] run rstat under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.493 sec
FailedNone
None
Test case:[libs/GSL] run siman under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.281 sec
FailedNone
None
Test case:[libs/GSL] run sort under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.248 sec
FailedNone
None
Test case:[libs/GSL] run spblas under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.363 sec
FailedNone
None
Test case:[libs/GSL] run specfunc under resource manager (slurm/gnu14)
Outcome:Passed
Duration:2.31 sec
FailedNone
None
Test case:[libs/GSL] run splinalg under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.359 sec
FailedNone
None
Test case:[libs/GSL] run spmatrix under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[libs/GSL] run statistics under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.159 sec
FailedNone
None
Test case:[libs/GSL] run sum under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.148 sec
FailedNone
None
Test case:[libs/GSL] run sys under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.159 sec
FailedNone
None
Test case:[libs/GSL] run vector under resource manager (slurm/gnu14)
Outcome:Passed
Duration:5.692 sec
FailedNone
None
Test case:[libs/GSL] run wavelet under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.796 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.284 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/GSL] Verify GSL module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/GSL] Verify dynamic library available in GSL_LIB (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/GSL] Verify static library is not present in GSL_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/GSL] Verify header file is present in GSL_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration33.377 sec
Tests12
Failures0

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.356 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.356 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.348 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.423 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.351 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.418 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.345 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.425 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.355 sec
FailedNone
None

Test Suite: test_module

Results

Duration3.695 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.2 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.346 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration24.689 sec
Tests12
Failures0

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.272 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.268 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.268 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.401 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.27 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.334 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.269 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.203 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.404 sec
FailedNone
None

Test Suite: test_module

Results

Duration2.601 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.193 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.264 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration27.935 sec
Tests12
Failures0

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.274 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.276 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.273 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.339 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.272 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.411 sec
FailedNone
None
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.406 sec
FailedNone
None

Test Suite: test_module

Results

Duration2.618 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.197 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.273 sec
FailedNone
None

Test Suite: test_metis

Results

Duration3.283 sec
Tests4
Failures0

Tests

test_metis

Test case:[libs/Metis] Graph partition (gnu14)
Outcome:Passed
Duration:0.349 sec
FailedNone
None
Test case:[libs/Metis] Fill-reducing ordering (gnu14)
Outcome:Passed
Duration:2.448 sec
FailedNone
None
Test case:[libs/Metis] Mesh to graph conversion (gnu14)
Outcome:Passed
Duration:0.058 sec
FailedNone
None
Test case:[libs/Metis] C API mesh partitioning (slurm/gnu14)
Outcome:Passed
Duration:0.428 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.361 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/Metis] Verify METIS module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.174 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/Metis] Verify availability of m2gmetis binary (gnu14)
Outcome:Passed
Duration:0.044 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Metis] Verify dynamic library available in METIS_LIB (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/Metis] Verify static library is not present in METIS_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Metis] Verify header file is present in METIS_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration12.511 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.397 sec
FailedNone
None
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.399 sec
FailedNone
None
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.32 sec
FailedNone
None
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.395 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration16.844 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.482 sec
FailedNone
None
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.47 sec
FailedNone
None
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.407 sec
FailedNone
None
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.485 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration13.536 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.309 sec
FailedNone
None
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:7.683 sec
FailedNone
None
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.231 sec
FailedNone
None
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.313 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.353 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.215 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.025 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.356 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.218 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.353 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.218 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration13.481 sec
Tests5
Failures0

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.272 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.206 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.33 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.4 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.273 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.207 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-mvapich2)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration17.802 sec
Tests5
Failures0

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.346 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.411 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.349 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.349 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.347 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.219 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.198 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration13.501 sec
Tests5
Failures0

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.269 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.273 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.276 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.341 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.207 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_C_module

Results

Duration0.376 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.386 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.188 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14)
Outcome:Passed
Duration:0.049 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.385 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14)
Outcome:Passed
Duration:0.049 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration1.584 sec
Tests7
Failures0

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14)
Outcome:Passed
Duration:0.065 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14)
Outcome:Passed
Duration:0.059 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14)
Outcome:Passed
Duration:0.039 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14)
Outcome:Passed
Duration:0.524 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14)
Outcome:Passed
Duration:0.451 sec
FailedNone
None
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14)
Outcome:Passed
Duration:0.446 sec
FailedNone
None

Test Suite: test_C_module

Results

Duration0.379 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.182 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.39 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.191 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.39 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.189 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.049 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration1.743 sec
Tests7
Failures0

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/mvapich2)
Outcome:Passed
Duration:0.07 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/mvapich2)
Outcome:Passed
Duration:0.061 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/mvapich2)
Outcome:Passed
Duration:0.039 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.589 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.5 sec
FailedNone
None
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.484 sec
FailedNone
None

Test Suite: test_C_module

Results

Duration0.393 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.189 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.052 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.403 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.201 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.405 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.202 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration4.129 sec
Tests7
Failures0

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/openmpi5)
Outcome:Passed
Duration:0.232 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/openmpi5)
Outcome:Passed
Duration:0.06 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/openmpi5)
Outcome:Passed
Duration:0.04 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.43 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.183 sec
FailedNone
None
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.184 sec
FailedNone
None

Test Suite: test_C_module

Results

Duration0.385 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.185 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.39 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.19 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.39 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.189 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration3.453 sec
Tests7
Failures0

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/mpich)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/mpich)
Outcome:Passed
Duration:0.06 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/mpich)
Outcome:Passed
Duration:0.04 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.133 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.97 sec
FailedNone
None
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.069 sec
FailedNone
None

Test Suite: test_pnetcdf

Results

Duration6.343 sec
Tests2
Failures0

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.64 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.703 sec
FailedNone
None

Test Suite: test_pnetcdf

Results

Duration9.778 sec
Tests2
Failures0

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.897 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.881 sec
FailedNone
None

Test Suite: test_pnetcdf

Results

Duration4.789 sec
Tests2
Failures0

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.396 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.393 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration3.445 sec
Tests11
Failures0

Tests

rm_execution

Test case:[libs/openblas/dblat1] dblat1 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[libs/openblas/xccblat1] xccblat1 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.169 sec
FailedNone
None
Test case:[libs/openblas/xzcblat1] xzcblat1 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.156 sec
FailedNone
None
Test case:[libs/openblas/xscblat2] xscblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.316 sec
FailedNone
None
Test case:[libs/openblas/xdcblat2] xdcblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.336 sec
FailedNone
None
Test case:[libs/openblas/xccblat2] xccblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.575 sec
FailedNone
None
Test case:[libs/openblas/xzcblat2] xzcblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.474 sec
FailedNone
None
Test case:[libs/openblas/xscblat3] xscblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.196 sec
FailedNone
None
Test case:[libs/openblas/xdcblat3] xdcblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.211 sec
FailedNone
None
Test case:[libs/openblas/xccblat3] xccblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.215 sec
FailedNone
None
Test case:[libs/openblas/xzcblat3] xzcblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.619 sec
FailedNone
None

Test Suite: test_lapack

Results

Duration0.272 sec
Tests1
Failures0

Tests

test_lapack

Test case:[libs/OpenBLAS/eigen] run lapack eigen-value solver (gnu14)
Outcome:Passed
Duration:0.272 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.248 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/OpenBLAS] Verify OPENBLAS module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.169 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify module OPENBLAS_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify module OPENBLAS_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify dynamic library available in OPENBLAS_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify static library is not present in OPENBLAS_LIB (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.217 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.217 sec
FailedNone
None
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants

Test Suite: sms_execution

Results

Duration0.477 sec
Tests1
Failures0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/mpich)
Outcome:Passed
Duration:0.477 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.345 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration153.011 sec
Tests2
Failures1

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:153.011 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 39)
  `assert_success' failed

-- command failed --
status : 1
output (111 lines):
  job script = /tmp/job.ohpc-test.27093
  Batch job 310 submitted

  Job 310 encountered timeout...
  Reason=TimeLimit

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = srun --mpi=pmix -- ./hello  (family=openmpi5)
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
  [c1][[58344,0],0][base/btl_base_am_rdma.c:959:am_rdma_process_rdma] BTL is not compatible with active-message RDMA
  [c1:11982] *** Process received signal ***
  [c1:11982] Signal: Aborted (6)
  [c1:11982] Signal code:  (-6)
  [c1:11982] [ 0] /lib64/libc.so.6(+0x3e6f0)[0x7fbb6a9946f0]
  [c1:11982] [ 1] /lib64/libc.so.6(+0x8b94c)[0x7fbb6a9e194c]
  [c1:11982] [ 2] /lib64/libc.so.6(raise+0x16)[0x7fbb6a994646]
  [c1:11982] [ 3] /lib64/libc.so.6(abort+0xd3)[0x7fbb6a97e7f3]
  [c1:11982] [ 4] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(+0xc531c)[0x7fbb6a8ef31c]
  [c1:11982] [ 5] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(mca_btl_uct_am_handler+0x6f)[0x7fbb6a8f213f]
  [c1:11982] [ 6] /opt/ohpc/pub/mpi/ucx-ohpc/1.17.0/lib/ucx/libuct_ib.so.0(+0x5732d)[0x7fbb68d5e32d]
  [c1:11982] [ 7] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(+0xc693b)[0x7fbb6a8f093b]
  [c1:11982] [ 8] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(+0xc6d4c)[0x7fbb6a8f0d4c]
  [c1:11982] [ 9] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(opal_progress+0x34)[0x7fbb6a853aa4]
  [c1:11982] [10] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(+0x7514c)[0x7fbb6afec14c]
  [c1:11982] [11] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_comm_nextcid+0x3f)[0x7fbb6afefaaf]
  [c1:11982] [12] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_comm_dup_with_info+0xdd)[0x7fbb6afe9b9d]
  [c1:11982] [13] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(+0x2604fe)[0x7fbb6b1d74fe]
  [c1:11982] [14] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_osc_base_select+0x11d)[0x7fbb6b1ae09d]
  [c1:11982] [15] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_win_create_dynamic+0x89)[0x7fbb6b017319]
  [c1:11982] [16] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(MPI_Win_create_dynamic+0xa8)[0x7fbb6b05c438]
  [c1:11982] [17] /opt/ohpc/pub/libs/gnu14/openmpi5/opencoarrays/2.10.2/lib64/libcaf_mpi.so.3(_gfortran_caf_init+0x24c)[0x7fbb6b4278bc]
  [c1:11982] [18] /opt/ohpc/pub/libs/gnu14/openmpi5/opencoarrays/2.10.2/lib64/libcaf_mpi.so.3(_gfortran_caf_register+0x1e9)[0x7fbb6b428219]
  [c1:11982] [19] /home/ohpc-test/tests/libs/opencoarrays/tests/./hello[0x401171]
  [c1:11982] [20] /lib64/libc.so.6(__libc_start_main+0xfb)[0x7fbb6a97f6bb]
  [c1:11982] [21] /home/ohpc-test/tests/libs/opencoarrays/tests/./hello[0x401205]
  [c1:11982] *** End of error message ***
  srun: error: c1: task 0: Aborted (core dumped)
  slurmstepd-c1: error: *** JOB 310 ON c1 CANCELLED AT 2024-11-03T19:37:26 DUE TO TIME LIMIT ***
  srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
--
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants
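
Diagnostic note on the failure log above: the OpenCoarrays hello run under slurm/gnu14/openmpi5 shows ibv_cmd_create_qp returning errno 22 (EINVAL) and 95 (EOPNOTSUPP) before the Open MPI btl/uct progress path aborts and the job is cancelled at its time limit. A minimal hand-check of the fabric on a compute node, assuming the rdma-core and UCX command-line tools (ibv_devinfo, ucx_info) are installed; device names and limits differ per system:

  # Verbs device limits that bound queue-pair creation
  ibv_devinfo -v | grep -E 'hca_id|max_qp:|max_qp_wr:|max_cq:'

  # Transports and devices UCX detects; a transport the hardware does not
  # support can surface as EINVAL/EOPNOTSUPP from ibv_cmd_create_qp
  ucx_info -d | grep -E 'Transport|Device'

  # Locked-memory limit in the job environment also gates verbs resources
  ulimit -l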

Test Suite: sms_execution

Results

Duration1.61 sec
Tests1
Failures0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/openmpi5)
Outcome:Passed
Duration:1.61 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.353 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.188 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.159 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.159 sec
FailedNone
None
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants

Test Suite: sms_execution

Results

Duration0.153 sec
Tests1
Failures0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/mvapich2)
Outcome:Passed
Duration:0.153 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.337 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.176 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.034 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PETSc] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.282 sec
FailedNone
None
Test case:[libs/PETSc] MPI F77 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[libs/PETSc] MPI F90 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.41 sec
FailedNone
None

Test Suite: test_module

Results

Duration2.636 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.208 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify header file is present in PETSC_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Sample job (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.28 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.078 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PETSc] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.358 sec
FailedNone
None
Test case:[libs/PETSc] MPI F77 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.362 sec
FailedNone
None
Test case:[libs/PETSc] MPI F90 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.358 sec
FailedNone
None

Test Suite: test_module

Results

Duration3.724 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.215 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_BIN is defined and exists
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify header file is present in PETSC_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PETSc] Sample job (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.358 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration7.891 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PETSc] MPI C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.274 sec
FailedNone
None
Test case:[libs/PETSc] MPI F77 binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.276 sec
FailedNone
None
Test case:[libs/PETSc] MPI F90 binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.341 sec
FailedNone
None

Test Suite: test_module

Results

Duration1.567 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.208 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify header file is present in PETSC_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Sample job (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.209 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.299 sec
Tests7
Failures0

Tests

test_module

Test case:[HDF5] Verify HDF5 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[HDF5] Verify module HDF5_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[HDF5] Verify module HDF5_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HDF5] Verify dynamic library available in HDF5_LIB (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[HDF5] Verify static library is not present in HDF5_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HDF5] Verify module HDF5_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[HDF5] Verify header file is present in HDF5_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration3.474 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/PHDF5] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.267 sec
FailedNone
None
Test case:[libs/PHDF5] Parallel Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.207 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration7.752 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/PHDF5] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.408 sec
FailedNone
None
Test case:[libs/PHDF5] Parallel Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.344 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration3.458 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/PHDF5] MPI C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.26 sec
FailedNone
None
Test case:[libs/PHDF5] Parallel Fortran binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.198 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.522 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PLASMA/C_test] C_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.377 sec
FailedNone
None
Test case:[libs/PLASMA/F_test] F_test under resource manager (slurm/gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/PLASMA/F90_test] F90_test under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.145 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.29 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/PLASMA] Verify plasma module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.169 sec
FailedNone
None
Test case:[libs/PLASMA] Verify module PLASMA_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PLASMA] Verify module PLASMA_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PLASMA] Verify dynamic library available in PLASMA_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PLASMA] Verify static library is not present in PLASMA_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PLASMA] Verify module PLASMA_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PLASMA] Verify header file is present in PLASMA_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.26 sec
Tests5
Failures0

Tests

test_module

Test case:[PNETCDF] Verify PNETCDF module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[PNETCDF] Verify module PNETCDF_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[PNETCDF] Verify module PNETCDF_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[PNETCDF] Verify module PNETCDF_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[PNETCDF] Verify header file is present in PNETCDF_INC (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.1 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/PNETCDF] Parallel Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.259 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 2 runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.198 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 3 runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.382 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 4 runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.261 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration45.269 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/PNETCDF] Parallel Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.402 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 2 runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:34.197 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 3 runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.335 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 4 runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.335 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.09 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/PNETCDF] Parallel Fortran binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.195 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 2 runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.258 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 3 runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.382 sec
FailedNone
None
Test case:[libs/PNETCDF] Parallel Fortran binary 4 runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.255 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration7.852 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PTScotch] dgraph_redist binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.265 sec
FailedNone
None
Test case:[libs/PTScotch] strat_par binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.26 sec
FailedNone
None
Test case:[libs/PTScotch] dgord binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.327 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.366 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PTScotch] Verify PTSCOTCH module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/PTScotch] Verify PTSCOTCH_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify dynamic library available in PTSCOTCH_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify static library is not present in PTSCOTCH_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify header file is present in PTSCOTCH_INC (gnu14/mpich)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PTScotch] Verify availability of dgscat binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.022 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PTScotch] dgraph_redist binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.411 sec
FailedNone
None
Test case:[libs/PTScotch] strat_par binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.207 sec
FailedNone
None
Test case:[libs/PTScotch] dgord binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.404 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.377 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PTScotch] Verify PTSCOTCH module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.196 sec
FailedNone
None
Test case:[libs/PTScotch] Verify PTSCOTCH_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify dynamic library available in PTSCOTCH_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify static library is not present in PTSCOTCH_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify header file is present in PTSCOTCH_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/PTScotch] Verify availability of dgscat binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration6.78 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/PTScotch] dgraph_redist binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.199 sec
FailedNone
None
Test case:[libs/PTScotch] strat_par binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.261 sec
FailedNone
None
Test case:[libs/PTScotch] dgord binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.32 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.353 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PTScotch] Verify PTSCOTCH module is loaded and matches rpm version (gnu14-mvapich2)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[libs/PTScotch] Verify PTSCOTCH_DIR is defined and directory exists (gnu14-mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/PTScotch] Verify dynamic library available in PTSCOTCH_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify static library is not present in PTSCOTCH_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/PTScotch] Verify header file is present in PTSCOTCH_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify availability of dgscat binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.033 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration9.065 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/ScaLAPACK/PCSCAEX] CPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.267 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PDSCAEX] DPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.264 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PSSCAEX] SPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.202 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PZSCAEX] ZPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.332 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.265 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/ScaLAPACK] Verify SCALAPACK module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.182 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify dynamic library available in SCALAPACK_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify static library is not present in SCALAPACK_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration13.369 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/ScaLAPACK/PCSCAEX] CPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PDSCAEX] DPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.342 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PSSCAEX] SPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.341 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PZSCAEX] ZPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.344 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.272 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/ScaLAPACK] Verify SCALAPACK module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.188 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify dynamic library available in SCALAPACK_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify static library is not present in SCALAPACK_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration11.182 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/ScaLAPACK/PCSCAEX] CPCGESV under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.199 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PDSCAEX] DPCGESV under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.33 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PSSCAEX] SPCGESV under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.264 sec
FailedNone
None
Test case:[libs/ScaLAPACK/PZSCAEX] ZPCGESV under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.389 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.259 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/ScaLAPACK] Verify SCALAPACK module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify dynamic library available in SCALAPACK_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify static library is not present in SCALAPACK_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.512 sec
Tests3
Failures0

Tests

rm_execution

Test case:[libs/scotch] graph_map binary runs under resource manager (slurm/gnu14/)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[libs/scotch] strat_seq binary runs under resource manager (slurm/gnu14/)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[libs/scotch] gout binary runs under resource manager (slurm/gnu14/)
Outcome:Passed
Duration:0.168 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.351 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/Scotch] Verify SCOTCH module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[libs/Scotch] Verify SCOTCH_DIR is defined and directory exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Scotch] Verify module SCOTCH_LIB is defined and exists (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Scotch] Verify dynamic library available in SCOTCH_LIB (gnu14/)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Scotch] Verify static library is not present in SCOTCH_LIB (gnu14/)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/Scotch] Verify module SCOTCH_INC is defined and exists (gnu14/)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Scotch] Verify header file is present in SCOTCH_INC (gnu14/)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/Scotch] Verify module SCOTCH_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/Scotch] Verify availability of gout binary
Outcome:Passed
Duration:0.032 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.205 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/SLEPc] F90 SVD test binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.28 sec
FailedNone
None
Test case:[libs/SLEPc] C SVD of the Lauchli matrix binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.286 sec
FailedNone
None
Test case:[libs/SLEPc] F90 quadratic eigensystem with PEP object binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.358 sec
FailedNone
None
Test case:[libs/SLEPc] C nonsymmetric eigenproblem binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.281 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.326 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/slepc] Verify slepc module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.199 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify dynamic library available in SLEPC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify static library is not present in SLEPC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify header file is present in SLEPC_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration6.984 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/SLEPc] F90 SVD test binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.214 sec
FailedNone
None
Test case:[libs/SLEPc] C SVD of the Lauchli matrix binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.282 sec
FailedNone
None
Test case:[libs/SLEPc] F90 quadratic eigensystem with PEP object binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.276 sec
FailedNone
None
Test case:[libs/SLEPc] C nonsymmetric eigenproblem binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.212 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.323 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/slepc] Verify slepc module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.199 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify dynamic library available in SLEPC_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/slepc] Verify static library is not present in SLEPC_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify header file is present in SLEPC_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration15.611 sec
Tests4
Failures0

Tests

rm_execution

Test case:[libs/SLEPc] F90 SVD test binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.371 sec
FailedNone
None
Test case:[libs/SLEPc] C SVD of the Lauchli matrix binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.434 sec
FailedNone
None
Test case:[libs/SLEPc] F90 quadratic eigensystem with PEP object binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.368 sec
FailedNone
None
Test case:[libs/SLEPc] C nonsymmetric eigenproblem binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.438 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.338 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/slepc] Verify slepc module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.209 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify dynamic library available in SLEPC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify static library is not present in SLEPC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify header file is present in SLEPC_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.356 sec
Tests2
Failures0

Tests

rm_execution

Test case:[libs/SuperLU] C test runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.185 sec
FailedNone
None
Test case:[libs/SuperLU] F77 test runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.171 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.197 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/SUPERLU] Verify SUPERLU module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[libs/SUPERLU] Verify SUPERLU_DIR is defined and directory exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration28.005 sec
Tests9
Failures1

Tests

rm_execution

Test case:[libs/superLU_dist] PDGSSVX with full (default) options (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.281 sec
FailedNone
None
Test case:[libs/superLU_dist] pdgssvx_ABglobal with full (default) options (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.279 sec
FailedNone
None
Test case:[libs/superLU_dist] vary RHS (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.349 sec
FailedNone
None
Test case:[libs/superLU_dist] vary RHS ABglobal (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.28 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse permutation vector (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.28 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse permutation vector ABglobal (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.348 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse symbolic factorization (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.345 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse symbolic factorization ABglobal (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.341 sec
FailedNone
None
Test case:[libs/superLU_dist] multi-grid (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.502 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 110)
  `run_mpi_binary ./pddrive4 "g20.rua" $NODES 10' failed
job script = /tmp/job.ohpc-test.2573
Batch job 409 submitted

Job 409 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./pddrive4 g20.rua (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 1

[cli_1]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 1

Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 6

[cli_0]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 6

Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 5

[cli_2]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 5

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
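
Diagnostic note: the one failure in this rm_execution suite (superLU_dist multi-grid, 10 tasks under slurm/gnu14/mvapich2, batch job 409) dies in MVAPICH2's rdma_iba_hca_init because ibv_cmd_create_qp returns errno 12 (ENOMEM), i.e. queue-pair resources run out during initialization. A hand-check sketch, assuming Slurm accounting (sacct) is enabled; the job id comes from the log above:

  # Compare the device's queue-pair limits with what a 10-rank step needs
  ibv_devinfo -v | grep -E 'max_qp:|max_qp_wr:'

  # Confirm what the failed batch job actually requested
  sacct -j 409 --format=JobID,JobName,AllocNodes,AllocCPUS,State,ExitCode

  # Locked-memory limits are a common source of ENOMEM from verbs calls;
  # check them inside a job step, not on the head node
  srun -N 1 bash -c 'ulimit -l'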

Test Suite: test_module

Results

Duration0.324 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/superlu_dist] Verify SUPERLU_DIST module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.198 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_LIB is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify dynamic library available in SUPERLU_DIST_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify static library is not present in SUPERLU_DIST_LIB (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_INC is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify header file is present in SUPERLU_DIST_INC (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration33.465 sec
Tests9
Failures0

Tests

rm_execution

Test case:[libs/superLU_dist] PDGSSVX with full (default) options (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.432 sec
FailedNone
None
Test case:[libs/superLU_dist] pdgssvx_ABglobal with full (default) options (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.367 sec
FailedNone
None
Test case:[libs/superLU_dist] vary RHS (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.356 sec
FailedNone
None
Test case:[libs/superLU_dist] vary RHS ABglobal (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.364 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse permutation vector (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.354 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse permutation vector ABglobal (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.433 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse symbolic factorization (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.363 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse symbolic factorization ABglobal (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.434 sec
FailedNone
None
Test case:[libs/superLU_dist] multi-grid (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.362 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.334 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/superlu_dist] Verify SUPERLU_DIST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.207 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify dynamic library available in SUPERLU_DIST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify static library is not present in SUPERLU_DIST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify header file is present in SUPERLU_DIST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration25.897 sec
Tests9
Failures0

Tests

rm_execution

Test case:[libs/superLU_dist] PDGSSVX with full (default) options (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.279 sec
FailedNone
None
Test case:[libs/superLU_dist] pdgssvx_ABglobal with full (default) options (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.216 sec
FailedNone
None
Test case:[libs/superLU_dist] vary RHS (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.355 sec
FailedNone
None
Test case:[libs/superLU_dist] vary RHS ABglobal (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.357 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse permutation vector (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.281 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse permutation vector ABglobal (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.351 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse symbolic factorization (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.355 sec
FailedNone
None
Test case:[libs/superLU_dist] reuse symbolic factorization ABglobal (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.283 sec
FailedNone
None
Test case:[libs/superLU_dist] multi-grid (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.42 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.324 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/superlu_dist] Verify SUPERLU_DIST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.201 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify dynamic library available in SUPERLU_DIST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify static library is not present in SUPERLU_DIST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify header file is present in SUPERLU_DIST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: build

Results

Duration3.41 sec
Tests2
Failures0

Tests

build

Test case:[Trilinos] verify availability of Makefile.export.Trilinos (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
deprecated with newer Trilinos
Test case:[Trilinos] build Trilinos executables (gnu14/openmpi5)
Outcome:Passed
Duration:3.41 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration24.573 sec
Tests9
Failures0

Tests

rm_execution

Test case:[libs/Trilinos] Kokkos-MemorySpace runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.365 sec
FailedNone
None
Test case:[libs/Trilinos] Tpetra-InitMPI runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:5.499 sec
FailedNone
None
Test case:[libs/Trilinos] Tpetra-DataRedistribution runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.422 sec
FailedNone
None
Test case:[libs/Trilinos] Epetra-DataRedistribution runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.363 sec
FailedNone
None
Test case:[libs/Trilinos] Epetra-Galeri runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Galeri in 2.x
Test case:[libs/Trilinos] Epetra-Ifpack runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Ifpack in 2.x
Test case:[libs/Trilinos] Teuchos-ParameterList runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:2.288 sec
FailedNone
None
Test case:[libs/Trilinos] Teuchos-LAPACK runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.347 sec
FailedNone
None
Test case:[libs/Trilinos] Teuchos-BLAS runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:2.289 sec
FailedNone
None

Test Suite: build

Results

Duration5.094 sec
Tests2
Failures0

Tests

build

Test case:[Trilinos] verify availability of Makefile.export.Trilinos (gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
deprecated with newer Trilinos
Test case:[Trilinos] build Trilinos executables (gnu14/mpich)
Outcome:Passed
Duration:5.094 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration18.085 sec
Tests9
Failures0

Tests

rm_execution

Test case:[libs/Trilinos] Kokkos-MemorySpace runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.205 sec
FailedNone
None
Test case:[libs/Trilinos] Tpetra-InitMPI runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.27 sec
FailedNone
None
Test case:[libs/Trilinos] Tpetra-DataRedistribution runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.346 sec
FailedNone
None
Test case:[libs/Trilinos] Epetra-DataRedistribution runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.349 sec
FailedNone
None
Test case:[libs/Trilinos] Epetra-Galeri runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Galeri in 2.x
Test case:[libs/Trilinos] Epetra-Ifpack runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Ifpack in 2.x
Test case:[libs/Trilinos] Teuchos-ParameterList runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.283 sec
FailedNone
None
Test case:[libs/Trilinos] Teuchos-LAPACK runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.353 sec
FailedNone
None
Test case:[libs/Trilinos] Teuchos-BLAS runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.279 sec
FailedNone
None

Test Suite: build

Results

Duration4.639 sec
Tests2
Failures0

Tests

build

Test case:[Trilinos] verify availability of Makefile.export.Trilinos (gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
deprecated with newer Trilinos
Test case:[Trilinos] build Trilinos executables (gnu14/mvapich2)
Outcome:Passed
Duration:4.639 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration25.569 sec
Tests9
Failures3

Tests

rm_execution

Test case:[libs/Trilinos] Kokkos-MemorySpace runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.203 sec
FailedNone
None
Test case:[libs/Trilinos] Tpetra-InitMPI runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:6.548 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 414,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15716
Batch job 425 submitted

Job 425 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./lesson_tpetra_init.exe null (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[libs/Trilinos] Tpetra-DataRedistribution runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.498 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 414,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.30834
Batch job 426 submitted

Job 426 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./lesson_tpetra_dataredist.exe null (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[libs/Trilinos] Epetra-DataRedistribution runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.493 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 414,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.14965
Batch job 427 submitted

Job 427 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./lesson_epetra_dataredist.exe null (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[libs/Trilinos] Epetra-Galeri runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Galeri in 2.x
Test case:[libs/Trilinos] Epetra-Ifpack runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Ifpack in 2.x
Test case:[libs/Trilinos] Teuchos-ParameterList runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.277 sec
FailedNone
None
Test case:[libs/Trilinos] Teuchos-LAPACK runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.209 sec
FailedNone
None
Test case:[libs/Trilinos] Teuchos-BLAS runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:3.341 sec
FailedNone
None
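
Diagnostic note: the three failures in this suite (Tpetra-InitMPI, Tpetra-DataRedistribution and Epetra-DataRedistribution, batch jobs 425-427) carry the same signature as the superLU_dist multi-grid failure above: MVAPICH2 aborts in rdma_iba_hca_init with ibv_cmd_create_qp returning errno 12 (ENOMEM), while the matching mpich and openmpi5 suites pass. A sketch for reproducing one case by hand from the directory holding the built Trilinos test executables; the executable name is taken from the log, and the module names and two-node/eight-task allocation are examples rather than the harness's own settings:

  module purge
  module load gnu14
  module load mvapich2
  module load trilinos
  salloc -N 2 -n 8               # interactive allocation; drops into a new shell
  prun ./lesson_tpetra_init.exe  # OpenHPC launch wrapper, as used in the log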

Test Suite: lmod_installed

Results

Duration0.044 sec
Tests1
Failures0

Tests

lmod_installed

Test case:[modules] Check if lmod RPM installed
Outcome:Passed
Duration:0.044 sec
FailedNone
None

Test Suite: interactive_commands

Results

Duration5.34 sec
Tests8
Failures0

Tests

interactive_commands

Test case:[modules] module purge
Outcome:Passed
Duration:0.15 sec
FailedNone
None
Test case:[modules] module list
Outcome:Passed
Duration:0.246 sec
FailedNone
None
Test case:[modules] module help
Outcome:Passed
Duration:0.275 sec
FailedNone
None
Test case:[modules] module load/unload
Outcome:Passed
Duration:2.574 sec
FailedNone
None
Test case:[modules] module whatis
Outcome:Passed
Duration:0.27 sec
FailedNone
None
Test case:[modules] module swap
Outcome:Passed
Duration:0.529 sec
FailedNone
None
Test case:[modules] path updated
Outcome:Passed
Duration:0.747 sec
FailedNone
None
Test case:[modules] module depends-on
Outcome:Passed
Duration:0.549 sec
FailedNone
None
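
The interactive_commands checks above exercise standard Lmod verbs. A minimal sketch of an equivalent interactive session, using module names that appear elsewhere in this report; the exact sequence the harness runs may differ:

  module purge                  # start from an empty environment
  module load gnu14             # compiler toolchain
  module load mpich             # an MPI family from the gnu14 hierarchy
  module list                   # confirm what is loaded
  module whatis mpich           # one-line description
  module swap mpich openmpi5    # exchange one MPI family for another
  which mpicc                   # 'path updated': wrapper now resolves on PATH
  module unload openmpi5 gnu14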

Test Suite: rm_execution

Results

Duration1.814 sec
Tests4
Failures0

Tests

rm_execution

Test case:[modules] env variable passes through ()
Outcome:Passed
Duration:0.417 sec
FailedNone
None
Test case:[modules] loaded module passes through ()
Outcome:Passed
Duration:0.498 sec
FailedNone
None
Test case:[modules] module commands available in RMS job ()
Outcome:Passed
Duration:0.473 sec
FailedNone
None
Test case:[modules] module load propagates thru RMS ()
Outcome:Passed
Duration:0.426 sec
FailedNone
None

Test Suite: build

Results

Duration2.03 sec
Tests3
Failures0

Tests

build

Test case:[MPI] build/execute C binary (gnu14/mpich)
Outcome:Passed
Duration:0.642 sec
FailedNone
None
Test case:[MPI] build/execute C++ binary (gnu14/mpich)
Outcome:Passed
Duration:0.75 sec
FailedNone
None
Test case:[MPI] build/execute F90 binary (gnu14/mpich)
Outcome:Passed
Duration:0.638 sec
FailedNone
None
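
The build checks above compile and run a small test code with each wrapper. A minimal sketch of the C case for gnu14/mpich; hello.c is a stand-in for the harness's own source file:

  module load gnu14
  module load mpich
  mpicc -O2 hello.c -o hello    # compile with the wrapper from the mpich module
  mpiexec -n 2 ./hello          # quick local run; the rm_execution_* suites
                                # below exercise binaries like this under Slurm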

Test Suite: man_page_check

Results

Duration0.102 sec
Tests1
Failures0

Tests

man_page_check

Test case:[MPI] mpicc man page available (gnu14/mpich)
Outcome:Passed
Duration:0.102 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration7.667 sec
Tests3
Failures0

Tests

rm_execution_multi_host

Test case:[MPI] C binary runs on two nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.254 sec
FailedNone
None
Test case:[MPI] C++ binary runs on two nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.206 sec
FailedNone
None
Test case:[MPI] F90 binary runs on two nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.207 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration7.643 sec
Tests3
Failures0

Tests

rm_execution_single_host

Test case:[MPI] C binary runs on single node under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.199 sec
FailedNone
None
Test case:[MPI] C++ binary runs on single node under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.246 sec
FailedNone
None
Test case:[MPI] F90 binary runs on single node under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.198 sec
FailedNone
None

Test Suite: version_match

Results

Duration0.55 sec
Tests3
Failures0

Tests

version_match

Test case:[MPI] MPI module loaded (gnu14/mpich)
Outcome:Passed
Duration:0.132 sec
FailedNone
None
Test case:[MPI] MPI module version available (gnu14/mpich)
Outcome:Passed
Duration:0.132 sec
FailedNone
None
Test case:[MPI] mpicc, mpicxx, and mpif90 versions match module (gnu14/mpich)
Outcome:Passed
Duration:0.286 sec
FailedNone
None

Test Suite: build

Results

Duration3.564 sec
Tests3
Failures0

Tests

build

Test case:[MPI] build/execute C binary (gnu14/openmpi5)
Outcome:Passed
Duration:1.731 sec
FailedNone
None
Test case:[MPI] build/execute C++ binary (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ not available with openmpi5
Test case:[MPI] build/execute F90 binary (gnu14/openmpi5)
Outcome:Passed
Duration:1.833 sec
FailedNone
None

Test Suite: man_page_check

Results

Duration0.093 sec
Tests1
Failures0

Tests

man_page_check

Test case:[MPI] mpicc man page available (gnu14/openmpi5)
Outcome:Passed
Duration:0.093 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration7.578 sec
Tests3
Failures0

Tests

rm_execution_multi_host

Test case:[MPI] C binary runs on two nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.263 sec
FailedNone
None
Test case:[MPI] C++ binary runs on two nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ not available with openmpi5
Test case:[MPI] F90 binary runs on two nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.315 sec
FailedNone
None

Test Suite: rm_execution_single_host

Results

Duration7.564 sec
Tests3
Failures0

Tests

rm_execution_single_host

Test case:[MPI] C binary runs on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.254 sec
FailedNone
None
Test case:[MPI] C++ binary runs on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ not available with openmpi5
Test case:[MPI] F90 binary runs on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.31 sec
FailedNone
None

Test Suite: version_match

Results

Duration0.598 sec
Tests3
Failures0

Tests

version_match

Test case:[MPI] MPI module loaded (gnu14/openmpi5)
Outcome:Passed
Duration:0.136 sec
FailedNone
None
Test case:[MPI] MPI module version available (gnu14/openmpi5)
Outcome:Passed
Duration:0.136 sec
FailedNone
None
Test case:[MPI] mpicc, mpicxx, and mpif90 versions match module (gnu14/openmpi5)
Outcome:Passed
Duration:0.326 sec
FailedNone
None

Test Suite: build

Results

Duration1.165 sec
Tests3
Failures0

Tests

build

Test case:[MPI] build/execute C binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.362 sec
FailedNone
None
Test case:[MPI] build/execute C++ binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.449 sec
FailedNone
None
Test case:[MPI] build/execute F90 binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.354 sec
FailedNone
None

Test Suite: man_page_check

Results

Duration0.087 sec
Tests1
Failures0

Tests

man_page_check

Test case:[MPI] mpicc man page available (gnu14/mvapich2)
Outcome:Passed
Duration:0.087 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration16.81 sec
Tests3
Failures3

Tests

rm_execution_multi_host

Test case:[MPI] C binary runs on two nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.548 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 24)
  `assert_success' failed

-- command failed --
status : 1
output (46 lines):
  job script = /tmp/job.ohpc-test.32125
  Batch job 444 submitted

  Job 444 failed...
  Reason=NonZeroExitCode

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_test 8 (family=mvapich2)
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_0]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_1]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
--
Test case:[MPI] C++ binary runs on two nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.622 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 36)
  `assert_success' failed

-- command failed --
status : 1
output (80 lines):
  job script = /tmp/job.ohpc-test.8917
  Batch job 445 submitted

  Job 445 failed...
  Reason=NonZeroExitCode

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./CXX_test 8 (family=mvapich2)
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_0]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_1]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_2]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_3]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
--
Test case:[MPI] F90 binary runs on two nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.64 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 45)
  `assert_success' failed

-- command failed --
status : 1
output (80 lines):
  job script = /tmp/job.ohpc-test.9682
  Batch job 446 submitted

  Job 446 failed...
  Reason=NonZeroExitCode

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F90_test 8 (family=mvapich2)
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_0]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_2]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_3]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

  [cli_1]: aborting job:
  Fatal error in MPI_Init:
  Other MPI error, error stack:
  MPIR_Init_thread(493)....:
  MPID_Init(419)...........: channel initialization failed
  MPIDI_CH3_Init(601)......:
  MPIDI_CH3I_RDMA_init(446):
  rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
--
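
All three mvapich2 multi-node cases abort inside MPI_Init with ibv_cmd_create_qp returning errno 12 (ENOMEM), i.e. the verbs layer could not allocate queue pairs on the compute hosts. Common causes for this pattern include a low locked-memory (memlock) limit inside the Slurm job environment or exhaustion of the HCA's queue-pair resources. The sketch below is a minimal diagnostic only, assuming the compute hostnames c1/c2 from the log and that the clustershell clush and rdma-core ibv_devinfo utilities are installed; it is not part of the test suite.

  # Locked-memory limit on the nodes themselves and as seen inside a Slurm job step
  clush -w c1,c2 'ulimit -l'            # expect "unlimited" (or a large value) for verbs/RDMA
  srun -N 2 -n 2 bash -c 'ulimit -l'    # limit actually propagated by slurmd into job steps
  # HCA queue-pair limits relevant to the failed ibv_cmd_create_qp calls
  clush -w c1,c2 'ibv_devinfo -v | grep -i -e max_qp -e max_qp_wr'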

Test Suite: rm_execution_single_host

Results

Duration7.621 sec
Tests3
Failures0

Tests

rm_execution_single_host

Test case:[MPI] C binary runs on single node under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.19 sec
FailedNone
None
Test case:[MPI] C++ binary runs on single node under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:1.144 sec
FailedNone
None
Test case:[MPI] F90 binary runs on single node under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.287 sec
FailedNone
None

Test Suite: version_match

Results

Duration0.423 sec
Tests3
Failures0

Tests

version_match

Test case:[MPI] MPI module loaded (gnu14/mvapich2)
Outcome:Passed
Duration:0.129 sec
FailedNone
None
Test case:[MPI] MPI module version available (gnu14/mvapich2)
Outcome:Passed
Duration:0.128 sec
FailedNone
None
Test case:[MPI] mpicc, mpicxx, and mpif90 versions match module (gnu14/mvapich2)
Outcome:Passed
Duration:0.166 sec
FailedNone
None

Test Suite: conman

Results

Duration0.15 sec
Tests3
Failures0

Tests

conman

Test case:[ConMan] Verify conman binary available
Outcome:Passed
Duration:0.041 sec
FailedNone
None
Test case:[ConMan] Verify rpm version matches binary
Outcome:Passed
Duration:0.057 sec
FailedNone
None
Test case:[ConMan] Verify man page availability
Outcome:Passed
Duration:0.052 sec
FailedNone
None

Test Suite: warewulf-ipmi

Results

Duration0.23 sec
Tests1
Failures0

Tests

warewulf-ipmi

Test case:[warewulf-ipmi] ipmitool lanplus protocol
Outcome:Passed
Duration:0.23 sec
FailedNone
None

Test Suite: test_module

Results

Duration1.996 sec
Tests6
Failures0

Tests

test_module

Test case:[Dimemas] Verify dimemas module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.031 sec
FailedNone
None
Test case:[Dimemas] Verify availability of prv2dim binary (gnu14/mpich)
Outcome:Passed
Duration:0.044 sec
FailedNone
None
Test case:[Dimemas] Verify availability of Dimemas binary (gnu14/mpich)
Outcome:Passed
Duration:0.044 sec
FailedNone
None
Test case:[Dimemas] Run Dimemas simulation (gnu14/mpich)
Outcome:Passed
Duration:1.641 sec
FailedNone
None

Test Suite: test_module

Results

Duration2.001 sec
Tests6
Failures0

Tests

test_module

Test case:[Dimemas] Verify dimemas module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.032 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[Dimemas] Verify availability of prv2dim binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.046 sec
FailedNone
None
Test case:[Dimemas] Verify availability of Dimemas binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.046 sec
FailedNone
None
Test case:[Dimemas] Run Dimemas simulation (gnu14/openmpi5)
Outcome:Passed
Duration:1.639 sec
FailedNone
None

Test Suite: test_module

Results

Duration1.982 sec
Tests6
Failures0

Tests

test_module

Test case:[Dimemas] Verify dimemas module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.193 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.031 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.031 sec
FailedNone
None
Test case:[Dimemas] Verify availability of prv2dim binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.046 sec
FailedNone
None
Test case:[Dimemas] Verify availability of Dimemas binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.044 sec
FailedNone
None
Test case:[Dimemas] Run Dimemas simulation (gnu14/mvapich2)
Outcome:Passed
Duration:1.637 sec
FailedNone
None

Test Suite: test_imb_mpi1

Results

Duration3.328 sec
Tests1
Failures0

Tests

test_imb_mpi1

Test case:[Libs/IMB] run IMB-MPI1 on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.328 sec
FailedNone
None

Test Suite: test_imb_mpi2

Results

Duration6.643 sec
Tests2
Failures0

Tests

test_imb_mpi2

Test case:[Libs/IMB] run IMB-EXT on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.323 sec
FailedNone
None
Test case:[Libs/IMB] run IMB-IO on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.32 sec
FailedNone
None

Test Suite: test_imb_mpi3

Results

Duration2.261 sec
Tests2
Failures0

Tests

test_imb_mpi3

Test case:[Libs/IMB] run IMB-NBC on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[Libs/IMB] run IMB-RMA on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.261 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.227 sec
Tests3
Failures0

Tests

test_module

Test case:[IMB] Verify IMB module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[IMB] Verify IMB_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[IMB] Verify executables are present in IMB_DIR/bin (gnu14-mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None

Test Suite: test_imb_mpi1

Results

Duration4.404 sec
Tests1
Failures0

Tests

test_imb_mpi1

Test case:[Libs/IMB] run IMB-MPI1 on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.404 sec
FailedNone
None

Test Suite: test_imb_mpi2

Results

Duration10.953 sec
Tests2
Failures1

Tests

test_imb_mpi2

Test case:[Libs/IMB] run IMB-EXT on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:5.483 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_imb_mpi2, line 41)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17280
Batch job 452 submitted

Job 452 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = srun --mpi=pmix -- /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT -npmin 100 -msglog 1:4 Window (family=openmpi5)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 95
[c1][[6025,0],0][base/btl_base_am_rdma.c:959:am_rdma_process_rdma] BTL is not compatible with active-message RDMA
[c1:18407] *** Process received signal ***
[c1:18407] Signal: Aborted (6)
[c1:18407] Signal code:  (-6)
[c1:18407] [ 0] /lib64/libc.so.6(+0x3e6f0)[0x7f46f1d386f0]
[c1:18407] [ 1] /lib64/libc.so.6(+0x8b94c)[0x7f46f1d8594c]
[c1:18407] [ 2] /lib64/libc.so.6(raise+0x16)[0x7f46f1d38646]
[c1:18407] [ 3] /lib64/libc.so.6(abort+0xd3)[0x7f46f1d227f3]
[c1:18407] [ 4] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(+0xc531c)[0x7f46f1c9131c]
[c1:18407] [ 5] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(mca_btl_uct_am_handler+0x6f)[0x7f46f1c9413f]
[c1:18407] [ 6] /opt/ohpc/pub/mpi/ucx-ohpc/1.17.0/lib/ucx/libuct_ib.so.0(+0x5732d)[0x7f46f014a32d]
[c1:18407] [ 7] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(+0xc693b)[0x7f46f1c9293b]
[c1:18407] [ 8] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(+0xc6d4c)[0x7f46f1c92d4c]
[c1:18407] [ 9] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libopen-pal.so.80(opal_progress+0x34)[0x7f46f1bf5aa4]
[c1:18407] [10] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_request_default_wait+0x130)[0x7f46f231efd0]
[c1:18407] [11] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_coll_base_barrier_intra_recursivedoubling+0xe0)[0x7f46f239dfb0]
[c1:18407] [12] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(mca_coll_han_barrier_intra_simple+0x4f)[0x7f46f23e590f]
[c1:18407] [13] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(ompi_osc_rdma_fence_atomic+0xb1)[0x7f46f24eb3e1]
[c1:18407] [14] /opt/ohpc/pub/mpi/openmpi5-gnu14/5.0.5/lib/libmpi.so.40(PMPI_Win_fence+0x8b)[0x7f46f236ba5b]
[c1:18407] [15] /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT[0x447a54]
[c1:18407] [16] /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT[0x43372a]
[c1:18407] [17] /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT[0x434a80]
[c1:18407] [18] /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT[0x4070aa]
[c1:18407] [19] /lib64/libc.so.6(+0x29590)[0x7f46f1d23590]
[c1:18407] [20] /lib64/libc.so.6(__libc_start_main+0x80)[0x7f46f1d23640]
[c1:18407] [21] /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT[0x405975]
[c1:18407] *** End of error message ***
srun: error: c1: task 0: Aborted (core dumped)
slurmstepd-c1: error:  mpi/pmix_v4: _errhandler: c1 [0]: pmixp_client_v2.c:211: Error handler invoked: status = -61, source = [slurm.pmix.452.0:0]
slurmstepd-c1: error: *** STEP 452.0 ON c1 CANCELLED AT 2024-11-03T19:52:55 ***
srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
srun: error: c1: task 1: Killed
srun: error: c2: tasks 2-3: Killed
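
This openmpi5 failure differs from the mvapich2 ones: after the ibv_cmd_create_qp errors (errno 22 = EINVAL, 95 = EOPNOTSUPP, 12 = ENOMEM), rank 0 aborts later, inside PMPI_Win_fence, through the osc/rdma one-sided path and the btl/uct component ("BTL is not compatible with active-message RDMA"). The re-run sketch below reuses the IMB-EXT invocation from the log and sets standard Open MPI MCA parameters through the environment to exclude the uct BTL and prefer the UCX one-sided component; treat it as a hypothesis to test, not a verified fix.

  # Re-run only the failing window benchmark with btl/uct excluded
  export OMPI_MCA_btl='^uct'   # drop the component implicated in the abort backtrace
  export OMPI_MCA_osc='ucx'    # prefer the UCX-based one-sided (RMA) component
  srun -N 2 -n 4 --mpi=pmix \
      /opt/ohpc/pub/libs/gnu14/openmpi5/imb/2021.3/bin/IMB-EXT -npmin 100 -msglog 1:4 Window
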
Test case:[Libs/IMB] run IMB-IO on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:5.47 sec
FailedNone
None

Test Suite: test_imb_mpi3

Results

Duration3.336 sec
Tests2
Failures0

Tests

test_imb_mpi3

Test case:[Libs/IMB] run IMB-NBC on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[Libs/IMB] run IMB-RMA on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.336 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.239 sec
Tests3
Failures0

Tests

test_module

Test case:[IMB] Verify IMB module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[IMB] Verify IMB_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[IMB] Verify executables are present in IMB_DIR/bin (gnu14-openmpi5)
Outcome:Passed
Duration:0.027 sec
FailedNone
None

Test Suite: test_imb_mpi1

Results

Duration5.459 sec
Tests1
Failures1

Tests

test_imb_mpi1

Test case:[Libs/IMB] run IMB-MPI1 on 2 nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.459 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_imb_mpi1, line 38)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.25400
Batch job 455 submitted

Job 455 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm /opt/ohpc/pub/libs/gnu14/mvapich2/imb/2021.3/bin/IMB-MPI1 -off_cache -1 -time 1.5 -npmin 100 -msglog 1:4 PingPong Sendrecv Bcast Allreduce (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_8]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12

Test Suite: test_imb_mpi2

Results

Duration10.92 sec
Tests2
Failures2

Tests

test_imb_mpi2

Test case:[Libs/IMB] run IMB-EXT on 2 nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.463 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_imb_mpi2, line 41)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.12360
Batch job 456 submitted

Job 456 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm /opt/ohpc/pub/libs/gnu14/mvapich2/imb/2021.3/bin/IMB-EXT -npmin 100 -msglog 1:4 Window (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_2]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_3]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[Libs/IMB] run IMB-IO on 2 nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.457 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_imb_mpi2, line 53)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31943
Batch job 457 submitted

Job 457 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm /opt/ohpc/pub/libs/gnu14/mvapich2/imb/2021.3/bin/IMB-IO -npmin 100 Open_Close (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12

Test Suite: test_imb_mpi3

Results

Duration5.461 sec
Tests2
Failures1

Tests

test_imb_mpi3

Test case:[Libs/IMB] run IMB-NBC on 2 nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[Libs/IMB] run IMB-RMA on 2 nodes under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.461 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_imb_mpi3, line 56)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11665
Batch job 458 submitted

Job 458 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm /opt/ohpc/pub/libs/gnu14/mvapich2/imb/2021.3/bin/IMB-RMA -npmin 100 -msglog 1:4 Unidir_put Unidir_get (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_4]: aborting job:
Fatal error in PMPI_Init_thread:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Test Suite: test_module

Results

Duration0.226 sec
Tests3
Failures0

Tests

test_module

Test case:[IMB] Verify IMB module is loaded and matches rpm version (gnu14-mvapich2)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[IMB] Verify IMB_DIR is defined and directory exists (gnu14-mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[IMB] Verify executables are present in IMB_DIR/bin (gnu14-mvapich2)
Outcome:Passed
Duration:0.025 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration11.974 sec
Tests2
Failures0

Tests

rm_execution

Test case:[OMB] run osu_bw on 2 nodes under resource manager (/gnu14/mpich)
Outcome:Passed
Duration:2.265 sec
FailedNone
None
Test case:[OMB] run osu_latency on 2 nodes under resource manager (/gnu14/mpich)
Outcome:Passed
Duration:9.709 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.367 sec
Tests7
Failures0

Tests

test_module

Test case:[OMB] Verify OMB module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.18 sec
FailedNone
None
Test case:[OMB] Verify OMB_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[OMB] Verify osu_bw binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_latency binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_allgather binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_get_bw binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_put_latency binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration8.824 sec
Tests2
Failures0

Tests

rm_execution

Test case:[OMB] run osu_bw on 2 nodes under resource manager (/gnu14/openmpi5)
Outcome:Passed
Duration:3.341 sec
FailedNone
None
Test case:[OMB] run osu_latency on 2 nodes under resource manager (/gnu14/openmpi5)
Outcome:Passed
Duration:5.483 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.381 sec
Tests7
Failures0

Tests

test_module

Test case:[OMB] Verify OMB module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[OMB] Verify OMB_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[OMB] Verify osu_bw binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_latency binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_allgather binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_get_bw binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_put_latency binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration15.188 sec
Tests2
Failures2

Tests

rm_execution

Test case:[OMB] run osu_bw on 2 nodes under resource manager (/gnu14/mvapich2)
Outcome:Failed
Duration:5.466 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 22)
  `run_mpi_binary -t "$CMD_TIMEOUT" "$EXE" "-m 512" 2 2' failed
job script = /tmp/job.ohpc-test.23214
Batch job 463 submitted

Job 463 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm osu_bw -m 512 (family=mvapich2)
[c1:mpi_rank_0][rdma_find_network_type] Unable to find the numa process is bound to. Disabling process placement aware hca mapping.
[c2:mpi_rank_1][rdma_find_network_type] Unable to find the numa process is bound to. Disabling process placement aware hca mapping.
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[OMB] run osu_latency on 2 nodes under resource manager (/gnu14/mvapich2)
Outcome:Failed
Duration:9.722 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 35)
  `run_mpi_binary -t "$CMD_TIMEOUT" "$EXE" "-m $MESSAGE_SIZE" 2 2' failed
job script = /tmp/job.ohpc-test.3803
Batch job 464 submitted

Job 464 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm osu_latency -m 512 (family=mvapich2)
[c2:mpi_rank_1][rdma_find_network_type] Unable to find the numa process is bound to. Disabling process placement aware hca mapping.
[c1:mpi_rank_0][rdma_find_network_type] Unable to find the numa process is bound to. Disabling process placement aware hca mapping.
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
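
Both OMB failures reproduce the same mvapich2 MPI_Init abort (ibv_cmd_create_qp, errno 12/ENOMEM) seen in the earlier multi-node suites, here preceded by the rdma_find_network_type NUMA warning. If queue-pair resources on the HCA are the bottleneck, MVAPICH2 exposes runtime parameters that shrink the per-QP work queues, such as MV2_DEFAULT_MAX_SEND_WQE / MV2_DEFAULT_MAX_RECV_WQE (check the installed MVAPICH2 documentation for exact names and defaults). The sketch below is illustrative only: the values are guesses, osu_bw is assumed to be on PATH via the loaded OMB module, and prun is the OpenHPC launcher already used by these tests.

  # Hypothetical re-run of the failing osu_bw case with smaller work queues per QP
  export MV2_DEFAULT_MAX_SEND_WQE=16
  export MV2_DEFAULT_MAX_RECV_WQE=16
  prun osu_bw -m 512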

Test Suite: test_module

Results

Duration0.367 sec
Tests7
Failures0

Tests

test_module

Test case:[OMB] Verify OMB module is loaded and matches rpm version (gnu14-mvapich2)
Outcome:Passed
Duration:0.18 sec
FailedNone
None
Test case:[OMB] Verify OMB_DIR is defined and directory exists (gnu14-mvapich2)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[OMB] Verify osu_bw binary is available (gnu14-mvapich2)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_latency binary is available (gnu14-mvapich2)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_allgather binary is available (gnu14-mvapich2)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_get_bw binary is available (gnu14-mvapich2)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_put_latency binary is available (gnu14-mvapich2)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration17.566 sec
Tests3
Failures0

Tests

rm_execution

Test case:[perf-tools/Scalasca] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:10.303 sec
FailedNone
None
Test case:[perf-tools/Scalasca] MPI C++ binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:4.962 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Serial C OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.301 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.51 sec
Tests11
Failures0

Tests

test_module

Test case:[perf-tools/Scalasca] Verify scalasca module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.189 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scalasca binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scan binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.mpi binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.omp binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.hyb binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.ser binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of square binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of user guide for scalasca (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration23.0 sec
Tests3
Failures0

Tests

rm_execution

Test case:[perf-tools/Scalasca] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:12.469 sec
FailedNone
None
Test case:[perf-tools/Scalasca] MPI C++ binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:8.194 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Serial C OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:2.337 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.526 sec
Tests11
Failures0

Tests

test_module

Test case:[perf-tools/Scalasca] Verify scalasca module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.2 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scalasca binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scan binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.mpi binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.omp binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.hyb binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.ser binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of square binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of user guide for scalasca (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration16.592 sec
Tests3
Failures2

Tests

rm_execution

Test case:[perf-tools/Scalasca] MPI C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:8.74 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 24)
  `run_mpi_binary -s 1 ./mpi/C_mpi_test $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.25409
Batch job 500 submitted

Job 500 failed...
Reason=NonZeroExitCode

S=C=A=N: Scalasca 2.6.1 trace collection and analysis
S=C=A=N: ./scorep_C_mpi_test_8x2_trace experiment archive
S=C=A=N: Sun Nov  3 19:59:03 2024: Collect start
/opt/ohpc/pub/utils/prun/2.2/prun ./mpi/C_mpi_test 8
[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/C_mpi_test 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
S=C=A=N: Sun Nov  3 19:59:06 2024: Collect done (status=0) 3s
Abort: missing experiment archive ./scorep_C_mpi_test_8x2_trace
Test case:[perf-tools/Scalasca] MPI C++ binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.532 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 38)
  `run_mpi_binary -s 1 ./mpi/CXX_mpi_test $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.18428
Batch job 501 submitted

Job 501 failed...
Reason=NonZeroExitCode

S=C=A=N: Scalasca 2.6.1 trace collection and analysis
S=C=A=N: ./scorep_CXX_mpi_test_8x2_trace experiment archive
S=C=A=N: Sun Nov  3 19:59:11 2024: Collect start
/opt/ohpc/pub/utils/prun/2.2/prun ./mpi/CXX_mpi_test 8
[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/CXX_mpi_test 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
S=C=A=N: Sun Nov  3 19:59:12 2024: Collect done (status=0) 1s
Abort: missing experiment archive ./scorep_CXX_mpi_test_8x2_trace
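
Both Scalasca failures are downstream of the same mvapich2 MPI_Init abort: scan reports "Collect done (status=0)" because the prun wrapper exits, but no measurement ever ran, so the expected Score-P experiment archive is missing. Once the mvapich2 QP issue is resolved, the collection can be repeated by hand with the same commands the log shows (archive name and binary path taken from the log above); this is a sketch, not part of the test harness.

  # Re-collect and score the trace for the C MPI case after the MPI fix
  scalasca -analyze -t prun ./mpi/C_mpi_test 8
  scalasca -examine -s scorep_C_mpi_test_8x2_trace   # "square" is the equivalent shorthand
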
Test case:[perf-tools/Scalasca] Serial C OpenMP binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:2.32 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.515 sec
Tests11
Failures0

Tests

test_module

Test case:[perf-tools/Scalasca] Verify scalasca module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.19 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scalasca binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scan binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.mpi binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.omp binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.hyb binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.ser binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of square binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of user guide for scalasca (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None

Test Suite: instrumenter_test

Results

Duration2.001 sec
Tests9
Failures0

Tests

instrumenter_test

Test case:[perf-tools/Score-P] MPI C binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.161 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.179 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.18 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.179 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.325 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.328 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.324 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration24.296 sec
Tests9
Failures0

Tests

rm_execution

Test case:[perf-tools/Score-P] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.49 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.497 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:3.791 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.42 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.394 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.39 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.53 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.519 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:9.265 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.733 sec
Tests14
Failures0

Tests

test_module

Test case:[perf-tools/Score-P] Verify scorep module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.19 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-backend-info binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-info binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-preload-init binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-score binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-wrapper binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep wrapper binaries (gnu14/mpich)
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of user guide for scorep (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of OPEN_ISSUES for scorep (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of COPYING for scorep (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
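
Note: the module checks above can be reproduced by hand roughly as sketched below. The Lmod module name scorep is taken from the test names; the RPM package name scorep-gnu14-mpich-ohpc is an assumption and may differ on a given install.

# Load the module and confirm the environment it advertises
module load scorep
module show scorep                                   # should define SCOREP_DIR and SCOREP_BIN
test -d "$SCOREP_DIR" && test -d "$SCOREP_BIN"
# Compare the module-provided binary against the installed RPM version (package name assumed)
scorep --version
rpm -q --queryformat '%{VERSION}\n' scorep-gnu14-mpich-ohpc
# Spot-check the helper binaries exercised by the individual test cases
command -v scorep scorep-config scorep-info scorep-score scorep-wrapper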

Test Suite: instrumenter_test

Results

Duration2.116 sec
Tests9
Failures0

Tests

instrumenter_test

Test case:[perf-tools/Score-P] MPI C binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.171 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.171 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.185 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.185 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.35 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.352 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.349 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration35.001 sec
Tests9
Failures0

Tests

rm_execution

Test case:[perf-tools/Score-P] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.654 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.662 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.897 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.42 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.421 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.399 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.695 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.683 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:10.17 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.787 sec
Tests14
Failures0

Tests

test_module

Test case:[perf-tools/Score-P] Verify scorep module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.196 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-backend-info binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-info binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-preload-init binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-score binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-wrapper binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep wrapper binaries (gnu14/openmpi5)
Outcome:Passed
Duration:0.2 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of user guide for scorep (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of OPEN_ISSUES for scorep (gnu14/openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of COPYING for scorep (gnu14/openmpi5)
Outcome:Passed
Duration:0.033 sec
FailedNone
None

Test Suite: instrumenter_test

Results

Duration1.984 sec
Tests9
Failures0

Tests

instrumenter_test

Test case:[perf-tools/Score-P] MPI C binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.158 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.156 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.18 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.323 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.327 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary includes expected Score-P instrumenter symbols (/gnu14/mvapich2)
Outcome:Passed
Duration:0.319 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration34.356 sec
Tests9
Failures6

Tests

rm_execution

Test case:[perf-tools/Score-P] MPI C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.522 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 45)
  `run_mpi_binary ./mpi/main_mpi_c $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.22036
Batch job 483 submitted

Job 483 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/main_mpi_c 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Test case:[perf-tools/Score-P] MPI C++ binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.521 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 62)
  `run_mpi_binary ./mpi/main_mpi_cxx $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.10191
Batch job 484 submitted

Job 484 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/main_mpi_cxx 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Test case:[perf-tools/Score-P] MPI Fortran binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.515 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 79)
  `run_mpi_binary ./mpi/main_mpi_fort $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.21879
Batch job 485 submitted

Job 485 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/main_mpi_fort 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[cli_3]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[perf-tools/Score-P] Serial C OpenMP binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.433 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.392 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:0.413 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.514 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 147)
  `run_mpi_binary ./mpi/main_hyb_c $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.17441
Batch job 489 submitted

Job 489 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/main_hyb_c 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_3]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.529 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 164)
  `run_mpi_binary ./mpi/main_hyb_cxx $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.10525
Batch job 490 submitted

Job 490 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/main_hyb_cxx 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_4]: [create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_5]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary runs under resource manager (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.517 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 181)
  `run_mpi_binary ./mpi/main_hyb_fort $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.5521
Batch job 491 submitted

Job 491 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi/main_hyb_fort 8 (family=mvapich2)
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_1]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_2]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_3]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_4]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_5]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_6]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_7]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
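
Note: all six mvapich2 failures above share the same root error, ibv_cmd_create_qp failing with errno 12 (ENOMEM) during MPI_Init. This commonly points to a locked-memory (memlock) limit that is too low inside the job environment, or to exhausted/unavailable InfiniBand queue-pair resources on the compute nodes. A minimal diagnostic sketch follows; the partition name normal and the two-node allocation are assumptions taken from the surrounding tests.

# Memlock limit as seen inside a Slurm job step on the compute nodes
srun -N 2 -p normal bash -c 'echo "$(hostname): $(ulimit -l)"'
# HCA capabilities; missing devices or a very low max_qp would also explain the failure
srun -N 1 -p normal ibv_devinfo -v | grep -E 'hca_id|max_qp'

If the first command reports a small value rather than unlimited, the limits propagated by Slurm to job steps are the likely culprit (see also the mem_limits suite later in this report).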

Test Suite: test_module

Results

Duration0.734 sec
Tests14
Failures0

Tests

test_module

Test case:[perf-tools/Score-P] Verify scorep module is loaded and matches rpm version (gnu14/mvapich2)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_DIR is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_BIN is defined and exists (gnu14/mvapich2)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-backend-info binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-config binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-info binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-preload-init binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-score binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-wrapper binary (gnu14/mvapich2)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep wrapper binaries (gnu14/mvapich2)
Outcome:Passed
Duration:0.156 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of user guide for scorep (gnu14/mvapich2)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of OPEN_ISSUES for scorep (gnu14/mvapich2)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of COPYING for scorep (gnu14/mvapich2)
Outcome:Passed
Duration:0.033 sec
FailedNone
None

Test Suite: test_harness

Results

Duration3.432 sec
Tests3
Failures0

Tests

test_harness

Test case:[RMS/harness] Verify zero exit code from MPI job runs OK (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.26 sec
FailedNone
None
Test case:[RMS/harness] Verify non-zero exit code from MPI job detected as failure (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.172 sec
FailedNone
None
Test case:[RMS/harness] Verify long-running MPI job terminates with timeout parameter (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled timeout test

Test Suite: test_harness

Results

Duration7.668 sec
Tests3
Failures0

Tests

test_harness

Test case:[RMS/harness] Verify zero exit code from MPI job runs OK (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.335 sec
FailedNone
None
Test case:[RMS/harness] Verify non-zero exit code from MPI job detected as failure (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:4.333 sec
FailedNone
None
Test case:[RMS/harness] Verify long-running MPI job terminates with timeout parameter (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled timeout test

Test Suite: test_harness

Results

Duration9.76 sec
Tests3
Failures1

Tests

test_harness

Test case:[RMS/harness] Verify zero exit code from MPI job runs OK (slurm/gnu14/mvapich2)
Outcome:Failed
Duration:5.456 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_harness, line 23)
  `run_mpi_binary ./mpi_exit 0 $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.20384
Batch job 508 submitted

Job 508 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./mpi_exit 0 (family=mvapich2)
[c2:mpi_rank_1][rdma_find_network_type] Unable to find the numa process is bound to. Disabling process placement aware hca mapping.
[c1:mpi_rank_0][rdma_find_network_type] Unable to find the numa process is bound to. Disabling process placement aware hca mapping.
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
[create_qp:2752]create qp: failed on ibv_cmd_create_qp with 12
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0

[cli_0]: aborting job:
Fatal error in MPI_Init:
Other MPI error, error stack:
MPIR_Init_thread(493)....:
MPID_Init(419)...........: channel initialization failed
MPIDI_CH3_Init(601)......:
MPIDI_CH3I_RDMA_init(446):
rdma_iba_hca_init(2003)..: Failed to create qp for rank 0
Test case:[RMS/harness] Verify non-zero exit code from MPI job detected as failure (slurm/gnu14/mvapich2)
Outcome:Passed
Duration:4.304 sec
FailedNone
None
Test case:[RMS/harness] Verify long-running MPI job terminates with timeout parameter (slurm/gnu14/mvapich2)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled timeout test

Test Suite: run

Results

Duration4.183 sec
Tests5
Failures0

Tests

run

Test case:[charliecloud] check for RPM
Outcome:Passed
Duration:0.292 sec
FailedNone
None
Test case:[charliecloud] build image
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip experimental charliecloud image builder
Test case:[charliecloud] build alpine image from Docker (using singularity)
Outcome:Passed
Duration:3.101 sec
FailedNone
None
Test case:[charliecloud] exec image locally
Outcome:Passed
Duration:0.3 sec
FailedNone
None
Test case:[charliecloud] exec image via slurm
Outcome:Passed
Duration:0.49 sec
FailedNone
None

Test Suite: run

Results

Duration9.386 sec
Tests4
Failures0

Tests

run

Test case:[singularity] check for RPM
Outcome:Passed
Duration:0.031 sec
FailedNone
None
Test case:[singularity] pull down ubuntu docker image
Outcome:Passed
Duration:8.588 sec
FailedNone
None
Test case:[singularity] exec image
Outcome:Passed
Duration:0.321 sec
FailedNone
None
Test case:[singularity] exec image via slurm
Outcome:Passed
Duration:0.446 sec
FailedNone
None
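
Note: the singularity checks above pull an image from Docker Hub and execute it both locally and under the resource manager. A minimal sketch of the same flow is shown below; the image name and the partition name normal are assumptions.

# Pull an Ubuntu image from Docker Hub into a local SIF file
singularity pull docker://ubuntu
# Execute a trivial command in the container locally...
singularity exec ubuntu_latest.sif cat /etc/os-release
# ...and through Slurm on a compute node
srun -n 1 -p normal singularity exec ubuntu_latest.sif hostname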

Test Suite: completion

Results

Duration0.085 sec
Tests1
Failures0

Tests

completion

Test case:[singularity] check for bash completion
Outcome:Passed
Duration:0.085 sec
FailedNone
None

Test Suite: ntp

Results

Duration0.404 sec
Tests3
Failures0

Tests

ntp

Test case:[ntp] check for chronyc binary
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[ntp] verify local time in sync on SMS
Outcome:Passed
Duration:0.148 sec
FailedNone
None
Test case:[ntp] verify local time in sync on compute
Outcome:Passed
Duration:0.222 sec
FailedNone
None
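
Note: the time-sync checks above rely on chrony. A rough by-hand equivalent is sketched below; the compute host name c1 is taken from earlier output and the "Leap status" line is only a heuristic for sync state.

# On the SMS: confirm chrony is tracking an upstream source
chronyc tracking
# The same check on a compute node via pdsh
pdsh -w c1 chronyc tracking | grep -i 'leap status'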

Test Suite: mem_limits

Results

Duration0.404 sec
Tests2
Failures0

Tests

mem_limits

Test case:[memlock] check increased soft limit
Outcome:Passed
Duration:0.205 sec
FailedNone
None
Test case:[memlock] check increased hard limit
Outcome:Passed
Duration:0.199 sec
FailedNone
None
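
Note: the memlock checks above verify that compute nodes carry raised locked-memory limits, which RDMA-capable MPI stacks such as the mvapich2 builds that failed earlier in this report depend on. A minimal sketch is given below; the host list c[1-4] is taken from the pdsh tests that follow, and the expectation of "unlimited" is an assumption based on common HPC limits.conf settings.

# Soft and hard memlock limits as seen on each compute node
pdsh -w c[1-4] 'ulimit -S -l; ulimit -H -l'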

Test Suite: pdsh

Results

Duration0.545 sec
Tests4
Failures0

Tests

pdsh

Test case:[pdsh] check for RPM
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[pdsh] run a shell command on c[1-4]
Outcome:Passed
Duration:0.219 sec
FailedNone
None
Test case:[pdsh] check for pdsh-mod-slurm RPM
Outcome:Passed
Duration:0.052 sec
FailedNone
None
Test case:[pdsh] run a shell command on -P normal
Outcome:Passed
Duration:0.224 sec
FailedNone
None
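
Note: the two execution checks above amount to targeting nodes by explicit host list and by Slurm partition (the latter via the pdsh-mod-slurm plugin). The host list c[1-4] and partition normal are taken from the test names; the command itself is arbitrary.

# Run a command on an explicit host range
pdsh -w c[1-4] uptime
# Run the same command on all nodes in the 'normal' Slurm partition
pdsh -P normal uptime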

Test Suite: ompi_info

Results

Duration0.654 sec
Tests1
Failures0

Tests

ompi_info

Test case:[openmpi] check for no output to stderr with ompi_info
Outcome:Passed
Duration:0.654 sec
FailedNone
None

Test Suite: magpie

Results

Duration0.925 sec
Tests3
Failures0

Tests

magpie

Test case:[magpie] check for RPM
Outcome:Passed
Duration:0.052 sec
FailedNone
None
Test case:[magpie] Verify MAGPIE module is loaded and matches rpm version
Outcome:Passed
Duration:0.529 sec
FailedNone
None
Test case:[magpie] Verify module MAGPIE_DIR is defined and exists
Outcome:Passed
Duration:0.344 sec
FailedNone
None

Test Suite: munge

Results

Duration1.141 sec
Tests4
Failures0

Tests

munge

Test case:[munge] check for OS-provided RPM
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[munge] Generate a credential
Outcome:Passed
Duration:0.032 sec
FailedNone
None
Test case:[munge] Decode credential locally
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[munge] Run benchmark
Outcome:Passed
Duration:1.033 sec
FailedNone
None
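
Note: the munge checks above map onto the standard MUNGE command-line tools; a minimal sketch, using default options throughout, is shown below.

# Generate a credential and decode it locally in one step
munge -n | unmunge
# Simple credential throughput benchmark
remunge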

Test Suite: sacct

Results

Duration0.0 sec
Tests1
Failures0

Tests

sacct

Test case:[slurm] check for working sacct
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Temporarily skip sacct check

Test Suite: sinfo

Results

Duration0.087 sec
Tests1
Failures0

Tests

sinfo

Test case:[slurm] Verify SLURM RPM version matches sinfo binary
Outcome:Passed
Duration:0.087 sec
FailedNone
None

Test Suite: oom

Results

Duration0.0 sec
Tests1
Failures0

Tests

oom

Test case:[oom] Test job OOM condition (gnu14/slurm)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled to investigate disk fillup