Duration | 0.541 sec |
---|---|
Tests | 5 |
Failures | 0 |
Duration | 1.525 sec |
---|---|
Tests | 4 |
Failures | 2 |
Test case: | [modules] env variable passes through () |
---|---|
Outcome: | Passed |
Duration: | 0.376 sec |
Test case: | [modules] loaded module passes through () |
---|---|
Outcome: | Failed |
Duration: | 0.449 sec |
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58, in test file rm_execution, line 36) `assert_success' failed -- command failed -- status : 1 output : srun: error: c1: task 0: Exited with exit code 1 --
Test case: | [modules] module commands available in RMS job () |
---|---|
Outcome: | Failed |
Duration: | 0.33 sec |
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58, in test file rm_execution, line 44) `assert_success' failed -- command failed -- status : 1 output (2 lines): srun: error: c1: task 0: Exited with exit code 1 environment: line 17: /opt/ohpc/admin/lmod/lmod/libexec/lmod: No such file or directory --
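Both module failures above point at the same root cause visible in the second log: jobs on `c1` inherit an environment that references `/opt/ohpc/admin/lmod/lmod/libexec/lmod`, but nothing exists at that path on the compute node. A minimal triage sketch (the `srun -w c1` targeting and the `lmod-ohpc` package name are assumptions to verify against this cluster):

```bash
# Does the Lmod path referenced by the failing jobs exist on compute node c1?
srun -N1 -w c1 ls -l /opt/ohpc/admin/lmod/lmod/libexec/lmod

# Hypothetical follow-up: is the OpenHPC Lmod package present in the compute image?
srun -N1 -w c1 rpm -q lmod-ohpc
```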
Duration | 4.181 sec |
---|---|
Tests | 8 |
Failures | 0 |
Duration | 3.236 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.133 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 24) `run_mpi_binary ./${binary} "" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.2397 Batch job 31 submitted Job 31 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./arrays_write (family=openmpi5)
Test case: | [libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.103 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 34) `run_mpi_binary ./${binary} "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.26150 Batch job 32 submitted Job 32 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./scalars_write (family=openmpi5)
Duration | 0.184 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.125 sec |
Test case: | [libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.012 sec |
Duration | 295.251 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 145.044 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 24) `run_mpi_binary ./${binary} "" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.140 Batch job 29 submitted Job 29 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./arrays_write (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 29 ON c1 CANCELLED AT 2024-10-21T19:03:02 DUE TO TIME LIMIT ***
Test case: | [libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.207 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 34) `run_mpi_binary ./${binary} "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.14151 Batch job 30 submitted Job 30 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./scalars_write (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 30 ON c1 CANCELLED AT 2024-10-21T19:05:33 DUE TO TIME LIMIT ***
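Every mpich-family failure in this report follows the same pattern: `mpiexec.hydra -bootstrap slurm` invokes `srun --external-launcher`, the installed srun rejects the option, the launch never starts, and the job idles until Slurm cancels it at the time limit. This suggests an MPICH build that assumes a newer Slurm than the one installed. A quick compatibility check (a sketch; confirm against the Slurm changelog which release introduced `--external-launcher`):

```bash
# Which Slurm is installed, and does its srun know the option MPICH passes?
srun --version
srun --help | grep -q -- '--external-launcher' \
  && echo "srun supports --external-launcher" \
  || echo "srun predates --external-launcher; MPICH's slurm bootstrap will hang"
```

Until the pairing is fixed, launching the mpich binaries through plain `srun` instead of `prun`/`mpiexec.hydra` would be one way to confirm the binaries themselves are healthy.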
Duration | 0.178 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.12 sec |
Test case: | [libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Duration | 4.125 sec |
---|---|
Tests | 11 |
Failures | 0 |
Test case: | [libs/openblas/dblat1] dblat1 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.154 sec |
Test case: | [libs/openblas/xccblat1] xccblat1 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.211 sec |
Test case: | [libs/openblas/xzcblat1] xzcblat1 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.175 sec |
Test case: | [libs/openblas/xscblat2] xscblat2 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.43 sec |
Test case: | [libs/openblas/xdcblat2] xdcblat2 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.407 sec |
Test case: | [libs/openblas/xccblat2] xccblat2 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.581 sec |
Test case: | [libs/openblas/xzcblat2] xzcblat2 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.58 sec |
Test case: | [libs/openblas/xscblat3] xscblat3 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.225 sec |
Test case: | [libs/openblas/xdcblat3] xdcblat3 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.265 sec |
Duration | 0.134 sec |
---|---|
Tests | 5 |
Failures | 0 |
Test case: | [libs/OpenBLAS] Verify OPENBLAS module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.098 sec |
Test case: | [libs/OpenBLAS] Verify module OPENBLAS_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/OpenBLAS] Verify module OPENBLAS_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Duration | 3.242 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [libs/NetCDF] C parallel I/O (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.106 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_pnetcdf, line 28) `run_mpi_binary ./C_parallel "atest" 2 4' failed job script = /tmp/job.ohpc-test.11850 Batch job 205 submitted Job 205 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./C_parallel atest (family=openmpi5)
Test case: | [libs/NetCDF] Fortran parallel I/O (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.136 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_pnetcdf, line 64) `run_mpi_binary -t "00:02:00" ./F90_parallel "atest" 2 4' failed job script = /tmp/job.ohpc-test.10674 Batch job 206 submitted Job 206 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./F90_parallel atest (family=openmpi5)
Duration | 298.538 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [libs/NetCDF] C parallel I/O (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.225 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_pnetcdf, line 28) `run_mpi_binary ./C_parallel "atest" 2 4' failed job script = /tmp/job.ohpc-test.428 Batch job 200 submitted Job 200 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_parallel atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 200 ON c1 CANCELLED AT 2024-10-21T20:42:33 DUE TO TIME LIMIT ***
Test case: | [libs/NetCDF] Fortran parallel I/O (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.313 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_pnetcdf, line 64) `run_mpi_binary -t "00:02:00" ./F90_parallel "atest" 2 4' failed job script = /tmp/job.ohpc-test.26604 Batch job 201 submitted Job 201 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F90_parallel atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 201 ON c1 CANCELLED AT 2024-10-21T20:45:02 DUE TO TIME LIMIT ***
Duration | 0.21 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.12 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Test case: | [libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.023 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Duration | 0.21 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.12 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Test case: | [libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.022 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.859 sec |
---|---|
Tests | 7 |
Failures | 3 |
Test case: | [libs/NetCDF] ncdump availability (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.078 sec |
Test case: | [libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.027 sec |
Test case: | [libs/NetCDF] verify nc4 available for Fortran interface (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.018 sec |
Test case: | [libs/NetCDF] verify nc4 available for C++ interface (gnu14/openmpi5) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Skipped: | option no longer supported |
Test case: | [libs/NetCDF] C write/read (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.286 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file test_netcdf, line 64) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/netcdf/tests/./C_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Test case: | [libs/NetCDF] Fortran write/read (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.199 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file test_netcdf, line 105) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/netcdf/tests/./F90_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Test case: | [libs/NetCDF] C++ write/read (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.251 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file test_netcdf, line 147) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/netcdf/tests/./CXX_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
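The three write/read failures above are not NetCDF problems: exit status 127 with `error while loading shared libraries: librdmacm.so.1` means the dynamic linker on `c1` cannot find the RDMA connection-manager library that these binaries pull in. A sketch for confirming the gap (the binary path is copied from the logs; the rdma-core packaging that ships `librdmacm` varies by distro, so treat the fix as an assumption to verify):

```bash
# List every unresolved shared object for one failing binary, on the compute node.
srun -N1 -w c1 ldd /home/ohpc-test/tests/libs/netcdf/tests/C_write | grep 'not found'

# Is librdmacm registered with the dynamic linker on c1 at all?
srun -N1 -w c1 ldconfig -p | grep librdmacm
```

The same missing library reappears later in the FFTW serial tests and the HYPRE ex12f run, so repairing the compute image should clear several sections at once.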
Duration | 0.205 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.117 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF] Verify availability of nc-config binary (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.023 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.207 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.12 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.023 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.201 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.114 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.023 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Test case: | [libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.779 sec |
---|---|
Tests | 7 |
Failures | 3 |
Test case: | [libs/NetCDF] ncdump availability (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.038 sec |
Test case: | [libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.026 sec |
Test case: | [libs/NetCDF] verify nc4 available for Fortran interface (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.018 sec |
Test case: | [libs/NetCDF] verify nc4 available for C++ interface (gnu14/mpich) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Skipped: | option no longer supported |
Test case: | [libs/NetCDF] C write/read (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.237 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file test_netcdf, line 64) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/netcdf/tests/./C_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Test case: | [libs/NetCDF] Fortran write/read (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.222 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file test_netcdf, line 105) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/netcdf/tests/./F90_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Test case: | [libs/NetCDF] C++ write/read (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.238 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file test_netcdf, line 147) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/netcdf/tests/./CXX_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Duration | 0.199 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.111 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF] Verify availability of nc-config binary (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.023 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.193 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.107 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.022 sec |
Test case: | [libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Test case: | [libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.195 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.108 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.022 sec |
Test case: | [libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 1.572 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [libs/NetCDF] ncdump availability (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.033 sec |
Test case: | [libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.026 sec |
Test case: | [libs/NetCDF] verify nc4 available for Fortran interface (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.017 sec |
Test case: | [libs/NetCDF] verify nc4 available for C++ interface (gnu14) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Skipped: | option no longer supported |
Test case: | [libs/NetCDF] C write/read (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.459 sec |
Duration | 0.191 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.105 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/NetCDF] Verify availability of nc-config binary (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.023 sec |
Test case: | [libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Test case: | [libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Duration | 0.185 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/Metis] Verify METIS module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.099 sec |
Test case: | [libs/Metis] Verify module METIS_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/Metis] Verify module METIS_BIN is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/Metis] Verify availability of m2gmetis binary (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.021 sec |
Test case: | [libs/Metis] Verify module METIS_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/Metis] Verify dynamic library available in METIS_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/Metis] Verify static library is not present in METIS_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 4.111 sec |
---|---|
Tests | 4 |
Failures | 0 |
Duration | 2.175 sec |
---|---|
Tests | 2 |
Failures | 1 |
Test case: | [libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.175 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file rm_execution, line 39) `assert_success' failed -- command failed -- status : 1 output (9 lines): job script = /tmp/job.ohpc-test.28415 Batch job 225 submitted Job 225 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./hello (family=openmpi5) --
Duration | 0.183 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.11 sec |
Test case: | [libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 122.67 sec |
---|---|
Tests | 2 |
Failures | 1 |
Test case: | [libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 122.67 sec |
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58, in test file rm_execution, line 39) `assert_success' failed -- command failed -- status : 1 output (12 lines): job script = /tmp/job.ohpc-test.13692 Batch job 224 submitted Job 224 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./hello (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 224 ON c1 CANCELLED AT 2024-10-21T20:48:33 DUE TO TIME LIMIT *** --
Duration | 0.179 sec |
---|---|
Tests | 9 |
Failures | 0 |
Test case: | [libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.106 sec |
Test case: | [libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 12.72 sec |
---|---|
Tests | 5 |
Failures | 5 |
Test case: | [libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.103 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 26) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.23020 Batch job 186 submitted Job 186 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./C_double null (family=openmpi5)
Test case: | [libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.161 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 36) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.11024 Batch job 187 submitted Job 187 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./F_single null (family=openmpi5)
Test case: | [libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.162 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 46) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.26931 Batch job 188 submitted Job 188 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./F_double null (family=openmpi5)
Test case: | [libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.161 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 56) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.32198 Batch job 189 submitted Job 189 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./F_complex null (family=openmpi5)
Test case: | [libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.133 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 66) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.26352 Batch job 190 submitted Job 190 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./F_doublecomplex null (family=openmpi5)
Duration | 746.686 sec |
---|---|
Tests | 5 |
Failures | 5 |
Test case: | [libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 147.071 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 26) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.11212 Batch job 181 submitted Job 181 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_double null (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 181 ON c1 CANCELLED AT 2024-10-21T20:26:33 DUE TO TIME LIMIT ***
Test case: | [libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.169 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 36) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.1579 Batch job 182 submitted Job 182 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_single null (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 182 ON c1 CANCELLED AT 2024-10-21T20:29:03 DUE TO TIME LIMIT ***
Test case: | [libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 148.108 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 46) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.4781 Batch job 183 submitted Job 183 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_double null (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 183 ON c1 CANCELLED AT 2024-10-21T20:31:32 DUE TO TIME LIMIT ***
Test case: | [libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.167 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 56) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.9503 Batch job 184 submitted Job 184 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_complex null (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 184 ON c1 CANCELLED AT 2024-10-21T20:34:02 DUE TO TIME LIMIT ***
Test case: | [libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 151.171 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 66) `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.5311 Batch job 185 submitted Job 185 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_doublecomplex null (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 185 ON c1 CANCELLED AT 2024-10-21T20:36:33 DUE TO TIME LIMIT ***
Duration | 1.594 sec |
---|---|
Tests | 3 |
Failures | 3 |
Test case: | [libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.231 sec |
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58, in test file rm_execution, line 26) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/fftw/tests/./C_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Test case: | [libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.142 sec |
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58, in test file rm_execution, line 40) `assert_success' failed -- command failed -- status : 1 output (9 lines): job script = /tmp/job.ohpc-test.3073 Batch job 101 submitted Job 101 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./C_mpi_test 8 (family=openmpi5) --
Test case: | [libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.221 sec |
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58, in test file rm_execution, line 49) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/fftw/tests/./F_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Duration | 0.162 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.108 sec |
Test case: | [FFTW] Verify module FFTW_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [FFTW] Verify module FFTW_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [FFTW] Verify dynamic library available in FFTW_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [FFTW] Verify static library is not present in FFTW_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Duration | 0.159 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.105 sec |
Test case: | [FFTW] Verify module FFTW_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [FFTW] Verify module FFTW_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [FFTW] Verify dynamic library available in FFTW_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Test case: | [FFTW] Verify static library is not present in FFTW_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 324.363 sec |
---|---|
Tests | 3 |
Failures | 3 |
Test case: | [libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.205 sec |
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58, in test file rm_execution, line 26) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/fftw/tests/./C_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Test case: | [libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 323.918 sec |
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58, in test file rm_execution, line 40) `assert_success' failed -- command failed -- status : 1 output (12 lines): job script = /tmp/job.ohpc-test.25346 Batch job 98 submitted Job 98 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_mpi_test 8 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 98 ON c1 CANCELLED AT 2024-10-21T19:38:32 DUE TO TIME LIMIT *** --
Test case: | [libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.24 sec |
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58, in test file rm_execution, line 49) `assert_success' failed -- command failed -- status : 127 output (2 lines): /home/ohpc-test/tests/libs/fftw/tests/./F_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127 --
Duration | 15.852 sec |
---|---|
Tests | 6 |
Failures | 6 |
Test case: | [Boost/MPI] all_gather_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.1 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 21) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.27201 Batch job 91 submitted Job 91 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./all_gather_test atest (family=openmpi5)
Test case: | [Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.156 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 30) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.26579 Batch job 92 submitted Job 92 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./all_reduce_test atest (family=openmpi5)
Test case: | [Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.129 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 39) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.2624 Batch job 93 submitted Job 93 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./all_to_all_test atest (family=openmpi5)
Test case: | [Boost/MPI] groups_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.155 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 48) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.7081 Batch job 94 submitted Job 94 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./groups_test atest (family=openmpi5)
Test case: | [Boost/MPI] ring_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.156 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 66) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.3988 Batch job 95 submitted Job 95 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ring_test atest (family=openmpi5)
Test case: | [Boost/MPI] pointer_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.156 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 75) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.31550 Batch job 96 submitted Job 96 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./pointer_test atest (family=openmpi5)
Duration | 898.074 sec |
---|---|
Tests | 6 |
Failures | 6 |
Test case: | [Boost/MPI] all_gather_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 147.97 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 21) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.29089 Batch job 85 submitted Job 85 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./all_gather_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 85 ON c1 CANCELLED AT 2024-10-21T19:19:02 DUE TO TIME LIMIT ***
Test case: | [Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.008 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 30) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.21518 Batch job 86 submitted Job 86 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./all_reduce_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 86 ON c1 CANCELLED AT 2024-10-21T19:21:32 DUE TO TIME LIMIT ***
Test case: | [Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 151.1 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 39) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.11032 Batch job 87 submitted Job 87 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./all_to_all_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 87 ON c1 CANCELLED AT 2024-10-21T19:24:03 DUE TO TIME LIMIT ***
Test case: | [Boost/MPI] groups_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.022 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 48) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.30766 Batch job 88 submitted Job 88 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./groups_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 88 ON c1 CANCELLED AT 2024-10-21T19:26:32 DUE TO TIME LIMIT ***
Test case: | [Boost/MPI] ring_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.988 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 66) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.26987 Batch job 89 submitted Job 89 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ring_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 89 ON c1 CANCELLED AT 2024-10-21T19:29:02 DUE TO TIME LIMIT ***
Test case: | [Boost/MPI] pointer_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.986 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 75) `run_mpi_binary ./$test atest 2 16' failed job script = /tmp/job.ohpc-test.32694 Batch job 90 submitted Job 90 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./pointer_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 90 ON c1 CANCELLED AT 2024-10-21T19:31:32 DUE TO TIME LIMIT ***
Duration | 0.182 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.127 sec |
Test case: | [BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Test case: | [BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 0.181 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.125 sec |
Test case: | [BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Test case: | [BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Test case: | [BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Duration | 26.416 sec |
---|---|
Tests | 12 |
Failures | 9 |
Test case: | [libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.134 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 23) `run_mpi_binary ./ex1 $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.5830 Batch job 162 submitted Job 162 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex1 8 (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.134 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 32) `run_mpi_binary ./ex2 $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.25951 Batch job 163 submitted Job 163 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex2 8 (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 4.193 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 41) `run_mpi_binary ./ex6 "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.17622 Batch job 164 submitted Job 164 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex6 (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.163 sec |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 50) `run_mpi_binary ./ex8 "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.25280 Batch job 165 submitted Job 165 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex8 (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.135 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 59) `run_mpi_binary ./ex12f "" 1 2' failed job script = /tmp/job.ohpc-test.12790 Batch job 166 submitted Job 166 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex12f (family=openmpi5) ./ex12f: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./ex12f: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory -------------------------------------------------------------------------- prterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [prterun-c1-5772@1,0] Exit code: 127 --------------------------------------------------------------------------
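The ex12f failure above is the first openmpi5 case in this run to capture stderr: both ranks exit 127 because the loader cannot resolve librdmacm.so.1 on the compute node. A minimal check, run on c1, is sketched below (the dnf package name assumes an EL-family compute image):

```bash
# List any shared-library dependencies the loader cannot resolve
# for the failing binary.
ldd ./ex12f | grep "not found"

# Ask the runtime linker whether librdmacm is registered at all.
ldconfig -p | grep librdmacm

# On EL-family images the library ships in the librdmacm package
# (built from rdma-core); installing it into the compute image
# should clear these exit-code-127 loader failures.
dnf install -y librdmacm
```

The NonZeroExitCode failures in this suite that captured no output are consistent with the same missing library.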
Test case: | [libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 4.193 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 68) `run_mpi_binary ./ex3 "-n 33 -solver 0 -v 1 1" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.26307 Batch job 167 submitted Job 167 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex3 -n 33 -solver 0 -v 1 1 (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.105 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 77) `run_mpi_binary ./ex4 "-n 33 -solver 10 -K 3 -B 0 -C 1 -U0 2 -F 4" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.15687 Batch job 168 submitted Job 168 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex4 -n 33 -solver 10 -K 3 -B 0 -C 1 -U0 2 -F 4 (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 4.194 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 86) `run_mpi_binary ./ex5f "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.3309 Batch job 169 submitted Job 169 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex5f (family=openmpi5)
Test case: | [libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
skipped
Test case: | [libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
skipped
Test case: | [libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
C++ example depends on non-installed header
Test case: | [libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.165 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 125) `run_mpi_binary ./ex11 "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.12946 Batch job 170 submitted Job 170 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex11 (family=openmpi5)
Duration | 1.288 sec |
---|---|
Tests | 9 |
Failures | 1 |
Test case: | [libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.119 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Sample job (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.103 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_module, line 131) `run_mpi_binary ./ex1 "atest" 1 1' failed job script = /tmp/job.ohpc-test.1390 Batch job 161 submitted Job 161 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./ex1 atest (family=openmpi5) ./ex1: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
Duration | 1347.965 sec |
---|---|
Tests | 12 |
Failures | 9 |
Test case: | [libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.206 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 23) `run_mpi_binary ./ex1 $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.12572 Batch job 152 submitted Job 152 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex1 8 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 152 ON c1 CANCELLED AT 2024-10-21T19:47:33 DUE TO TIME LIMIT ***
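Every mpich-family case in this run fails the same way: mpiexec.hydra's slurm bootstrap invokes srun with `--external-launcher`, the installed srun rejects the option, and the job idles until its time limit. Slurm added `--external-launcher` in release 23.11, so this pattern points to an MPICH build that expects a newer Slurm than the one deployed. A hedged sketch for confirming the skew and retrying one case without the slurm bootstrap:

```bash
# Check the deployed Slurm release; --external-launcher first
# appeared in srun with Slurm 23.11.
srun --version
srun --help | grep -c 'external-launcher'   # 0 on older releases

# Hypothetical one-off retry: let hydra launch ranks over ssh
# instead of the incompatible srun (mirrors the -bootstrap flag
# already shown in the launch commands above).
module load gnu14 mpich
mpiexec.hydra -bootstrap ssh -n 2 ./ex1 8
```

The durable fixes are upgrading Slurm to 23.11 or newer, or rebuilding the mpich stack against the installed Slurm.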
Test case: | [libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.209 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 32) `run_mpi_binary ./ex2 $ARGS $NODES $TASKS' failed job script = /tmp/job.ohpc-test.25600 Batch job 153 submitted Job 153 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex2 8 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 153 ON c1 CANCELLED AT 2024-10-21T19:50:03 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.235 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 41) `run_mpi_binary ./ex6 "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.7501 Batch job 154 submitted Job 154 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex6 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 154 ON c1 CANCELLED AT 2024-10-21T19:52:32 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.236 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 50) `run_mpi_binary ./ex8 "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.3131 Batch job 155 submitted Job 155 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex8 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 155 ON c1 CANCELLED AT 2024-10-21T19:55:02 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 151.287 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 59) `run_mpi_binary ./ex12f "" 1 2' failed job script = /tmp/job.ohpc-test.31799 Batch job 156 submitted Job 156 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex12f (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 156 ON c1 CANCELLED AT 2024-10-21T19:57:33 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.214 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 68) `run_mpi_binary ./ex3 "-n 33 -solver 0 -v 1 1" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.18389 Batch job 157 submitted Job 157 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex3 -n 33 -solver 0 -v 1 1 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 157 ON c1 CANCELLED AT 2024-10-21T20:00:03 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.202 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 77) `run_mpi_binary ./ex4 "-n 33 -solver 10 -K 3 -B 0 -C 1 -U0 2 -F 4" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.9574 Batch job 158 submitted Job 158 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex4 -n 33 -solver 10 -K 3 -B 0 -C 1 -U0 2 -F 4 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 158 ON c1 CANCELLED AT 2024-10-21T20:02:33 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.233 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 86) `run_mpi_binary ./ex5f "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.26466 Batch job 159 submitted Job 159 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex5f (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 159 ON c1 CANCELLED AT 2024-10-21T20:05:02 DUE TO TIME LIMIT ***
Test case: | [libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
skipped
Test case: | [libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
skipped
Test case: | [libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
C++ example depends on non-installed header
Test case: | [libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 149.143 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 125) `run_mpi_binary ./ex11 "" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.26465 Batch job 160 submitted Job 160 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex11 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 160 ON c1 CANCELLED AT 2024-10-21T20:07:33 DUE TO TIME LIMIT ***
Duration | 150.318 sec |
---|---|
Tests | 9 |
Failures | 1 |
Test case: | [libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.113 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/HYPRE] Sample job (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.139 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_module, line 131) `run_mpi_binary ./ex1 "atest" 1 1' failed job script = /tmp/job.ohpc-test.17804 Batch job 151 submitted Job 151 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex1 atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 151 ON c1 CANCELLED AT 2024-10-21T19:45:03 DUE TO TIME LIMIT ***
Duration | 1.718 sec |
---|---|
Tests | 11 |
Failures | 0 |
Test case: | [Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.192 sec |
Failed | None |
None
Test case: | [Boost/Program Options] exception_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.207 sec |
Failed | None |
None
Test case: | [Boost/Program Options] options_description_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.182 sec |
Failed | None |
None
Test case: | [Boost/Program Options] parsers_test on master host (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.013 sec |
Failed | None |
None
Test case: | [Boost/Program Options] parsers_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.217 sec |
Failed | None |
None
Test case: | [Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.163 sec |
Failed | None |
None
Test case: | [Boost/Program Options] required_test on master host (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.013 sec |
Failed | None |
None
Test case: | [Boost/Program Options] required_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.159 sec |
Failed | None |
None
Test case: | [Boost/Program Options] unicode_test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.193 sec |
Failed | None |
None
Duration | 1.629 sec |
---|---|
Tests | 11 |
Failures | 0 |
Test case: | [Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.183 sec |
Failed | None |
None
Test case: | [Boost/Program Options] exception_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.238 sec |
Failed | None |
None
Test case: | [Boost/Program Options] options_description_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.162 sec |
Failed | None |
None
Test case: | [Boost/Program Options] parsers_test on master host (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.012 sec |
Failed | None |
None
Test case: | [Boost/Program Options] parsers_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.103 sec |
Failed | None |
None
Test case: | [Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.2 sec |
Failed | None |
None
Test case: | [Boost/Program Options] required_test on master host (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.012 sec |
Failed | None |
None
Test case: | [Boost/Program Options] required_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.178 sec |
Failed | None |
None
Test case: | [Boost/Program Options] unicode_test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.186 sec |
Failed | None |
None
Duration | 0.754 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Boost/Accumulators] min-test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.181 sec |
Failed | None |
None
Test case: | [Boost/Accumulators] max-test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.164 sec |
Failed | None |
None
Duration | 0.798 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Boost/Accumulators] min-test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.2 sec |
Failed | None |
None
Test case: | [Boost/Accumulators] max-test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.209 sec |
Failed | None |
None
Duration | 2.279 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 1.501 sec |
Failed | None |
None
Test case: | [Boost/Random] test_discrete under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.412 sec |
Failed | None |
None
Duration | 2.271 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 1.501 sec |
Failed | None |
None
Test case: | [Boost/Random] test_discrete under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.4 sec |
Failed | None |
None
Duration | 0.627 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Boost/Multi Array] access-test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.165 sec |
Failed | None |
None
Test case: | [Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.137 sec |
Failed | None |
None
Duration | 0.705 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Boost/Multi Array] access-test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.172 sec |
Failed | None |
None
Test case: | [Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.164 sec |
Failed | None |
None
Duration | 0.554 sec |
---|---|
Tests | 3 |
Failures | 3 |
Test case: | [Boost] bad_expression_test under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.178 sec |
Failed | None |
(from function `run_serial_binary' in file ./common/functions, line 236, in test file rm_execution, line 21) `run_serial_binary ./$test' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/./bad_expression_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127
Test case: | [Boost] named_subexpressions_test under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.208 sec |
Failed | None |
(from function `run_serial_binary' in file ./common/functions, line 236, in test file rm_execution, line 30) `run_serial_binary ./$test' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/./named_subexpressions_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c2: task 0: Exited with exit code 127
Test case: | [Boost] recursion_test under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.168 sec |
Failed | None |
(from function `run_serial_binary' in file ./common/functions, line 236, in test file rm_execution, line 39) `run_serial_binary ./$test config_test.cfg ' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/./recursion_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127
Duration | 0.2 sec |
---|---|
Tests | 1 |
Failures | 1 |
Test case: | [Boost] regress under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.2 sec |
Failed | None |
(from function `run_serial_binary' in file ../../../../../../common/functions, line 236, in test file rm_execution, line 21) `run_serial_binary ./$test' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/regress/./regress: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127
Duration | 0.544 sec |
---|---|
Tests | 3 |
Failures | 3 |
Test case: | [Boost] bad_expression_test under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.176 sec |
Failed | None |
(from function `run_serial_binary' in file ./common/functions, line 236, in test file rm_execution, line 21) `run_serial_binary ./$test' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/./bad_expression_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127
Test case: | [Boost] named_subexpressions_test under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.2 sec |
Failed | None |
(from function `run_serial_binary' in file ./common/functions, line 236, in test file rm_execution, line 30) `run_serial_binary ./$test' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/./named_subexpressions_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c2: task 0: Exited with exit code 127
Test case: | [Boost] recursion_test under resource manager (slurm/gnu14) |
---|---|
Outcome: | Failed |
Duration: | 0.168 sec |
Failed | None |
(from function `run_serial_binary' in file ./common/functions, line 236, in test file rm_execution, line 39) `run_serial_binary ./$test config_test.cfg ' failed with status 127 /home/ohpc-test/tests/libs/boost/tests/regex/test/./recursion_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory srun: error: c1: task 0: Exited with exit code 127
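All four regex failures in the two suites above are the same loader error: the Boost.Regex test binaries link against ICU soversion 67 (libicudata.so.67), which the compute image does not provide. A short sketch to confirm the mismatch (which ICU soversion the image actually ships is an assumption to verify):

```bash
# Which ICU soversion does the test binary expect?
ldd ./bad_expression_test | grep icu
# expected on the failing nodes: "libicudata.so.67 => not found"

# Which ICU soversion does the image provide?
ldconfig -p | grep libicudata
```

Either install the matching libicu into the compute image or rebuild the Boost regex tests against the ICU release that is installed.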
Duration | 4.505 sec |
---|---|
Tests | 8 |
Failures | 0 |
Test case: | [BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.129 sec |
Failed | None |
None
Test case: | [BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Duration | 4.513 sec |
---|---|
Tests | 8 |
Failures | 0 |
Test case: | [BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.123 sec |
Failed | None |
None
Test case: | [BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Duration | 79.896 sec |
---|---|
Tests | 50 |
Failures | 0 |
Test case: | [libs/GSL] run test_gsl_histogram (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.018 sec |
Failed | None |
None
Test case: | [libs/GSL] run block under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.71 sec |
Failed | None |
None
Test case: | [libs/GSL] run bspline under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.722 sec |
Failed | None |
None
Test case: | [libs/GSL] run cblas under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.168 sec |
Failed | None |
None
Test case: | [libs/GSL] run cdf under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.162 sec |
Failed | None |
None
Test case: | [libs/GSL] run cheb under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.168 sec |
Failed | None |
None
Test case: | [libs/GSL] run combination under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.152 sec |
Failed | None |
None
Test case: | [libs/GSL] run complex under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.143 sec |
Failed | None |
None
Test case: | [libs/GSL] run const under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.15 sec |
Failed | None |
None
Test case: | [libs/GSL] run deriv under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.159 sec |
Failed | None |
None
Test case: | [libs/GSL] run dht under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.287 sec |
Failed | None |
None
Test case: | [libs/GSL] run diff under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.161 sec |
Failed | None |
None
Test case: | [libs/GSL] run eigen under resource manager (slurm/gnu14) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
Skipping eigen test for ARCH=aarch64
Test case: | [libs/GSL] run err under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.17 sec |
Failed | None |
None
Test case: | [libs/GSL] run fft under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.599 sec |
Failed | None |
None
Test case: | [libs/GSL] run fit under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.146 sec |
Failed | None |
None
Test case: | [libs/GSL] run histogram under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.309 sec |
Failed | None |
None
Test case: | [libs/GSL] run ieee-utils under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.175 sec |
Failed | None |
None
Test case: | [libs/GSL] run integration under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 3.064 sec |
Failed | None |
None
Test case: | [libs/GSL] run interpolation under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.186 sec |
Failed | None |
None
Test case: | [libs/GSL] run linalg under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 1.839 sec |
Failed | None |
None
Test case: | [libs/GSL] run matrix under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 2.728 sec |
Failed | None |
None
Test case: | [libs/GSL] run min under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.178 sec |
Failed | None |
None
Test case: | [libs/GSL] run monte under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 1.767 sec |
Failed | None |
None
Test case: | [libs/GSL] run multifit under resource manager (slurm/gnu14) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
Skipping multifit test for ARCH=aarch64
Test case: | [libs/GSL] run multilarge under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.203 sec |
Failed | None |
None
Test case: | [libs/GSL] run multimin under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.162 sec |
Failed | None |
None
Test case: | [libs/GSL] run multiroots under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.153 sec |
Failed | None |
None
Test case: | [libs/GSL] run multiset under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.2 sec |
Failed | None |
None
Test case: | [libs/GSL] run ntuple under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.146 sec |
Failed | None |
None
Test case: | [libs/GSL] run ode-initval under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.618 sec |
Failed | None |
None
Test case: | [libs/GSL] run ode-initval2 under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 9.457 sec |
Failed | None |
None
Test case: | [libs/GSL] run permutation under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.138 sec |
Failed | None |
None
Test case: | [libs/GSL] run poly under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.17 sec |
Failed | None |
None
Test case: | [libs/GSL] run qrng under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.196 sec |
Failed | None |
None
Test case: | [libs/GSL] run randist under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 1.655 sec |
Failed | None |
None
Test case: | [libs/GSL] run rng under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 3.457 sec |
Failed | None |
None
Test case: | [libs/GSL] run roots under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.16 sec |
Failed | None |
None
Test case: | [libs/GSL] run rstat under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 6.904 sec |
Failed | None |
None
Test case: | [libs/GSL] run siman under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 1.337 sec |
Failed | None |
None
Test case: | [libs/GSL] run sort under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.201 sec |
Failed | None |
None
Test case: | [libs/GSL] run spblas under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.379 sec |
Failed | None |
None
Test case: | [libs/GSL] run specfunc under resource manager (slurm/gnu14) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
Skipping specfun test for ARCH=aarch64
Test case: | [libs/GSL] run splinalg under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.43 sec |
Failed | None |
None
Test case: | [libs/GSL] run spmatrix under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.184 sec |
Failed | None |
None
Test case: | [libs/GSL] run statistics under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.172 sec |
Failed | None |
None
Test case: | [libs/GSL] run sum under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.17 sec |
Failed | None |
None
Test case: | [libs/GSL] run sys under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.164 sec |
Failed | None |
None
Duration | 0.149 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [libs/GSL] Verify GSL module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.094 sec |
Failed | None |
None
Test case: | [libs/GSL] Verify module GSL_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [libs/GSL] Verify module GSL_LIB is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/GSL] Verify dynamic library available in GSL_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/GSL] Verify static library is not present in GSL_LIB (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Duration | 0.2 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.14 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/openmpi5) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Duration | 9.617 sec |
---|---|
Tests | 4 |
Failures | 4 |
Test case: | [libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.146 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 24) `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.6109 Batch job 175 submitted Job 175 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./p_laplace -no-vis (family=openmpi5)
Test case: | [libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.114 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 34) `run_mpi_binary ./${binary} "-no-vis -rs 2" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.14412 Batch job 176 submitted Job 176 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./p_laplace_perf -no-vis -rs 2 (family=openmpi5)
Test case: | [libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.178 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 44) `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.22749 Batch job 177 submitted Job 177 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./p_cantilever -no-vis (family=openmpi5)
Test case: | [libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.179 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 54) `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.15697 Batch job 178 submitted Job 178 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./p_diffusion -no-vis (family=openmpi5)
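None of the four MFEM/openmpi5 failures above captured stderr, but given the librdmacm.so.1 loader errors recorded in the HYPRE openmpi5 runs, the same missing library is the most likely cause; the ldd check sketched earlier applies unchanged to these binaries.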
Duration | 597.952 sec |
---|---|
Tests | 4 |
Failures | 4 |
Test case: | [libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 147.697 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 24) `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.26106 Batch job 171 submitted Job 171 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_laplace -no-vis (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 171 ON c1 CANCELLED AT 2024-10-21T20:13:33 DUE TO TIME LIMIT ***
Test case: | [libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.778 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 34) `run_mpi_binary ./${binary} "-no-vis -rs 2" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.25249 Batch job 172 submitted Job 172 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_laplace_perf -no-vis -rs 2 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 172 ON c1 CANCELLED AT 2024-10-21T20:16:03 DUE TO TIME LIMIT ***
Test case: | [libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 148.712 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 44) `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.12848 Batch job 173 submitted Job 173 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_cantilever -no-vis (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 173 ON c1 CANCELLED AT 2024-10-21T20:18:32 DUE TO TIME LIMIT ***
Test case: | [libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 150.765 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution, line 54) `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed job script = /tmp/job.ohpc-test.6129 Batch job 174 submitted Job 174 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_diffusion -no-vis (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 174 ON c1 CANCELLED AT 2024-10-21T20:21:02 DUE TO TIME LIMIT ***
Duration | 0.193 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.134 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Duration | 132.021 sec |
---|---|
Tests | 9 |
Failures | 1 |
Test case: | [libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.126 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.011 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify module PETSC_BIN is defined and exists |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/PETSc] Verify header file is present in PETSC_INC (gnu14/mpich) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [libs/PETSc] Sample job (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 131.829 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file test_module, line 130) `run_mpi_binary ./C_test "atest" 1 1' failed job script = /tmp/job.ohpc-test.27107 Batch job 226 submitted Job 226 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_test atest (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 226 ON c1 CANCELLED AT 2024-10-21T20:51:03 DUE TO TIME LIMIT ***
Duration | 3.169 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 3.156 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common-test/functions, line 414, in test file rm_execution_multi_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.2881 Batch job 15 submitted Job 15 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./src/miniFE.x.gnu14.openmpi5 nx=256 ny=256 nz=256 verify_solution=0 (family=openmpi5)
Test case: | [Apps/miniFE] log miniFE multi node results (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.013 sec |
Failed | None |
(in test file rm_execution_multi_host, line 54) `mv $run_yaml $wrk_yaml || exit 1' failed ls: cannot access 'miniFE.256x256x256.P16.*.yaml': No such file or directory mv: missing destination file operand after 'miniFE.256x256x256.P16.gnu14.openmpi5.yaml' Try 'mv --help' for more information.
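This log-results failure (and the matching ones in the remaining miniFE and HPCG suites below) is a cascade, not an independent fault: the run step produced no result YAML, so the `ls` that populates `$run_yaml` matches nothing and `mv` is invoked with a single operand. A minimal guard for the harness's log step, sketched with the variable and helper names quoted in the test code (`flunk` appears in the single-node variant below):

```bash
# Fail with a pointed message instead of a bare mv error when the
# run step left no YAML behind.
if [ -z "$run_yaml" ] || [ ! -e "$run_yaml" ]; then
    flunk "no result YAML produced by the run step; see the preceding failure"
fi
mv "$run_yaml" "$wrk_yaml" || flunk "Unable to move ${run_yaml} to ${wrk_yaml}"
```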
Duration | 1.119 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.102 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common-test/functions, line 414, in test file rm_execution_single_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.23582 Batch job 14 submitted Job 14 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./src/miniFE.x.gnu14.openmpi5 nx=100 ny=100 nz=100 verify_solution=1 (family=openmpi5) ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory -------------------------------------------------------------------------- prterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [prterun-c1-3126@1,2] Exit code: 127 --------------------------------------------------------------------------
Test case: | [Apps/miniFE] log miniFE single node results (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.017 sec |
Failed | None |
(from function `flunk' in file ./common-test/test_helper_functions.bash, line 14, in test file rm_execution_single_host, line 55) `mv $run_yaml $wrk_yaml || flunk "Unable to move ${run_yaml} file to ${work_yaml}"' failed ls: cannot access 'miniFE.100x100x100.P8.*.yaml': No such file or directory mv: missing destination file operand after 'miniFE.100x100x100.P8.gnu14.openmpi5.yaml' Try 'mv --help' for more information. Unable to move file to
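A small harness bug is visible in this message as well: the assertion interpolates `${work_yaml}`, but the variable assigned by the test is `$wrk_yaml`, so the destination in "Unable to move file to" is always blank (`$run_yaml` is blank here too, because the `ls` that sets it matched nothing). Renaming the message variable to `${wrk_yaml}`, as in the guard sketched above, makes the cascade readable.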
Duration | 330.647 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 330.634 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common-test/functions, line 414, in test file rm_execution_multi_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.31660 Batch job 13 submitted Job 13 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./src/miniFE.x.gnu14.mpich nx=256 ny=256 nz=256 verify_solution=0 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 13 ON c1 CANCELLED AT 2024-10-21T18:57:03 DUE TO TIME LIMIT ***
Test case: | [Apps/miniFE] log miniFE multi node results (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.013 sec |
Failed | None |
(in test file rm_execution_multi_host, line 54) `mv $run_yaml $wrk_yaml || exit 1' failed ls: cannot access 'miniFE.256x256x256.P16.*.yaml': No such file or directory mv: missing destination file operand after 'miniFE.256x256x256.P16.gnu14.mpich.yaml' Try 'mv --help' for more information.
Duration | 329.687 sec |
---|---|
Tests | 2 |
Failures | 2 |
Test case: | [Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 329.671 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common-test/functions, line 414, in test file rm_execution_single_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.27621 Batch job 12 submitted Job 12 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./src/miniFE.x.gnu14.mpich nx=100 ny=100 nz=100 verify_solution=1 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 12 ON c1 CANCELLED AT 2024-10-21T18:51:32 DUE TO TIME LIMIT ***
Test case: | [Apps/miniFE] log miniFE single node results (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.016 sec |
Failed | None |
(from function `flunk' in file ./common-test/test_helper_functions.bash, line 14, in test file rm_execution_single_host, line 55) `mv $run_yaml $wrk_yaml || flunk "Unable to move ${run_yaml} file to ${work_yaml}"' failed ls: cannot access 'miniFE.100x100x100.P8.*.yaml': No such file or directory mv: missing destination file operand after 'miniFE.100x100x100.P8.gnu14.mpich.yaml' Try 'mv --help' for more information. Unable to move file to
Duration | 2.153 sec |
---|---|
Tests | 3 |
Failures | 2 |
Test case: | [/apps/hpcg] check if resource manager is defined |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 2.129 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution_multi_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.1798 Batch job 11 submitted Job 11 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./bin/xhpcg.gnu14.openmpi5 32 32 32 10 (family=openmpi5)
Test case: | [/apps/hpcg] log HPCG multi node results (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.015 sec |
Failed | None |
(in test file rm_execution_multi_host, line 55) `mv $run_yaml $wrk_yaml' failed Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory Moving to HPCG.32x32x32.P16.gnu14.openmpi5.yaml mv: missing destination file operand after 'HPCG.32x32x32.P16.gnu14.openmpi5.yaml' Try 'mv --help' for more information.
Duration | 1.127 sec |
---|---|
Tests | 3 |
Failures | 2 |
Test case: | [/apps/hpcg] check if resource manager is defined |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.103 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution_single_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT ${EXE} "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.24985 Batch job 10 submitted Job 10 failed... Reason=NonZeroExitCode [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpirun -- ./bin/xhpcg.gnu14.openmpi5 32 32 32 10 (family=openmpi5) ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory ./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory -------------------------------------------------------------------------- prterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [prterun-c1-2922@1,0] Exit code: 127 --------------------------------------------------------------------------
Test case: | [/apps/hpcg] log HPCG single node results (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 0.015 sec |
Failed | None |
(in test file rm_execution_single_host, line 55) `mv $run_yaml $wrk_yaml' failed Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory Moving to HPCG.32x32x32.P8.gnu14.openmpi5.yaml mv: missing destination file operand after 'HPCG.32x32x32.P8.gnu14.openmpi5.yaml' Try 'mv --help' for more information.
Duration | 330.639 sec |
---|---|
Tests | 3 |
Failures | 2 |
Test case: | [/apps/hpcg] check if resource manager is defined |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Test case: | [/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 330.616 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution_multi_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed job script = /tmp/job.ohpc-test.19567 Batch job 9 submitted Job 9 encountered timeout... Reason=TimeLimit [prun] Master compute host = c1 [prun] Resource manager = slurm [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./bin/xhpcg.gnu14.mpich 32 32 32 10 (family=mpich) /bin/srun: unrecognized option '--external-launcher' Try "srun --help" for more information slurmstepd-c1: error: *** JOB 9 ON c1 CANCELLED AT 2024-10-21T18:45:34 DUE TO TIME LIMIT ***
Test case: | [/apps/hpcg] log HPCG multi node results (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.014 sec |
Failed | None |
(in test file rm_execution_multi_host, line 55) `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving to HPCG.32x32x32.P16.gnu14.mpich.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P16.gnu14.mpich.yaml'
Try 'mv --help' for more information.
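Same cascade as the openmpi5 case earlier: the timed-out run wrote no YAML, so `$run_yaml` was empty. One extra hardening beyond the guard sketched above: quoting the operands makes the failure mode explicit rather than misleading:

```bash
# With quotes, an empty run_yaml yields "mv: cannot stat ''" instead of
# the confusing "missing destination file operand" seen in the log
mv -- "$run_yaml" "$wrk_yaml"
```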
Duration | 308.209 sec |
---|---|
Tests | 3 |
Failures | 2 |
Test case: | [/apps/hpcg] check if resource manager is defined |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Failed | None |
None
Test case: | [/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 308.186 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file rm_execution_single_host, line 46) `run_mpi_binary -t $CMD_TIMEOUT ${EXE} "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17027
Batch job 8 submitted
Job 8 encountered timeout... Reason=TimeLimit
[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./bin/xhpcg.gnu14.mpich 32 32 32 10 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 8 ON c1 CANCELLED AT 2024-10-21T18:40:02 DUE TO TIME LIMIT ***
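The single-node run trips over the same `--external-launcher` mismatch as the multi-node one. If upgrading Slurm or rebuilding MPICH is not an option, hydra can also bootstrap over ssh instead of srun (assumes passwordless ssh to the compute nodes; `hostfile` is a hypothetical file listing them):

```bash
# Bypass the srun bootstrap entirely; hydra starts the ranks over ssh
mpiexec.hydra -bootstrap ssh -f hostfile -n "$TASKS" ./bin/xhpcg.gnu14.mpich 32 32 32 10
```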
Test case: | [/apps/hpcg] log HPCG single node results (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 0.015 sec |
Failed | None |
(in test file rm_execution_single_host, line 55) `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving to HPCG.32x32x32.P8.gnu14.mpich.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P8.gnu14.mpich.yaml'
Try 'mv --help' for more information.
Duration | 0.248 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 1.006 sec |
---|---|
Tests | 6 |
Failures | 0 |
Test case: | [Compilers] C binary runs under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.152 sec |
Failed | None |
None
Test case: | [Compilers] C++ binary runs under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.166 sec |
Failed | None |
None
Test case: | [Compilers] Fortran binary runs under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.185 sec |
Failed | None |
None
Test case: | [Compilers] C openmp binary runs under resource manager (slurm/gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.18 sec |
Failed | None |
None
Duration | 0.068 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 0.143 sec |
---|---|
Tests | 7 |
Failures | 0 |
Test case: | [HWLOC] Verify HWLOC module is loaded and matches rpm version (gnu14/) |
---|---|
Outcome: | Passed |
Duration: | 0.093 sec |
Failed | None |
None
Test case: | [HWLOC] Verify module HWLOC_DIR is defined and exists (gnu14/) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [HWLOC] Verify module HWLOC_LIB is defined and exists (gnu14/) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Failed | None |
None
Test case: | [HWLOC] Verify dynamic library available in HWLOC_LIB (gnu14/) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Failed | None |
None
Test case: | [HWLOC] Verify static library is not present in HWLOC_LIB (gnu14/) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Failed | None |
None
Duration | 5.067 sec |
---|---|
Tests | 8 |
Failures | 0 |
Test case: | [Valgrind] Verify valgrind module is loaded and matches rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.093 sec |
Failed | None |
None
Test case: | [Valgrind] Verify module VALGRIND_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Test case: | [Valgrind] Verify availability of valgrind binary (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.016 sec |
Failed | None |
None
Test case: | [Valgrind] Verify availability of man page (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.026 sec |
Failed | None |
None
Test case: | [Valgrind] Verify module VALGRIND_INC is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Failed | None |
None
Test case: | [Valgrind] Verify header file is present in VALGRIND_INC (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.009 sec |
Failed | None |
None
Duration | 0.027 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 137.685 sec |
---|---|
Tests | 1 |
Failures | 1 |
Test case: | [dev-tools/py3-mpi4py] python hello world (slurm/gnu14/mpich) |
---|---|
Outcome: | Failed |
Duration: | 137.685 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file hipy, line 35) `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9454
Batch job 24 submitted
Job 24 encountered timeout... Reason=TimeLimit
[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm python3 helloworld.py 8 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 24 ON c1 CANCELLED AT 2024-10-21T19:00:03 DUE TO TIME LIMIT ***
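The mpi4py timeout carries the same signature as the MPICH HPCG runs: the `srun --external-launcher` bootstrap fails and the job waits out its time limit. The Python side can be sanity-checked without hydra at all (module names are taken from the report's toolchain labels and may differ on your system):

```bash
# Launch the hello-world script straight through srun's PMI interface,
# sidestepping mpiexec.hydra and its srun bootstrap
module load gnu14 mpich py3-mpi4py
srun -N 2 -n 8 --mpi=pmi2 python3 helloworld.py
```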
Duration | 0.027 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 1.1 sec |
---|---|
Tests | 1 |
Failures | 1 |
Test case: | [dev-tools/py3-mpi4py] python hello world (slurm/gnu14/openmpi5) |
---|---|
Outcome: | Failed |
Duration: | 1.1 sec |
Failed | None |
(from function `run_mpi_binary' in file ./common/functions, line 414, in test file hipy, line 35) `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.6893
Batch job 25 submitted
Job 25 failed... Reason=NonZeroExitCode
[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- python3 helloworld.py 8 (family=openmpi5)
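Unlike the MPICH run, this job fails fast with a non-zero exit code, and the batch log captures no stderr. Given the `librdmacm.so.1` loader failures in the openmpi5 HPCG runs above, a missing RDMA library on the compute image is a plausible common cause; re-running interactively would surface the real error (the one-liner below is a stand-in for the test's `helloworld.py`, not its actual contents):

```bash
# Reproduce interactively so the loader or Python error reaches the terminal
module load gnu14 openmpi5 py3-mpi4py
mpirun -np 8 python3 -c 'from mpi4py import MPI; c = MPI.COMM_WORLD; print("rank", c.Get_rank(), "of", c.Get_size())'
```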
Duration | 1.492 sec |
---|---|
Tests | 4 |
Failures | 0 |
Duration | 0.036 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [Numpy] Verify NUMPY modules can be loaded and match rpm version (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.008 sec |
Failed | None |
None
Test case: | [Numpy] Verify module NUMPY_DIR is defined and exists (gnu14) |
---|---|
Outcome: | Passed |
Duration: | 0.01 sec |
Failed | None |
None
Duration | 0.028 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 0.028 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 4.154 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [/dev-tools/autotools] running autoreconf |
---|---|
Outcome: | Passed |
Duration: | 2.741 sec |
Failed | None |
None
Test case: | [/dev-tools/autotools] run generated configure |
---|---|
Outcome: | Passed |
Duration: | 1.312 sec |
Failed | None |
None
Duration | 0.706 sec |
---|---|
Tests | 3 |
Failures | 0 |
Duration | 1.114 sec |
---|---|
Tests | 4 |
Failures | 0 |
Test case: | [BOS] OS distribution matches (2 active computes) |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
disable BOS_RELEASE check
Test case: | [BOS] consistent kernel (2 active computes) |
---|---|
Outcome: | Passed |
Duration: | 0.556 sec |
Failed | None |
None
Duration | 1.401 sec |
---|---|
Tests | 4 |
Failures | 0 |
Duration | 0.03 sec |
---|---|
Tests | 6 |
Failures | 0 |
Test case: | [OOB] ipmitool local bmc ping |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
Skipping local bmc ping for ARCH=aarch64
Test case: | [OOB] ipmitool power status |
---|---|
Outcome: | Skipped |
Duration: | 0.0 sec |
Failed | None |
Skipped | None |
None
Skipping local power status for ARCH=aarch64