Test Report: results

Test Suite: InstallTests.computes_installed-20241021183048

Results

Duration: 0.541 sec
Tests: 5
Failures: 0

Tests

InstallTests.computes_installed

Test case:test_01_numcomputes_greater_than_zero
Outcome:Passed
Duration:0.0 sec
FailedNone
None
Test case:test_02_koomie_cf_available
Outcome:Passed
Duration:0.0 sec
FailedNone
None
Test case:test_03_nonzero_results_from_uptime
Outcome:Passed
Duration:0.541 sec
FailedNone
None
Test case:test_04_correct_number_of_hosts_booted
Outcome:Passed
Duration:0.0 sec
FailedNone
None
Test case:test_05_verify_boot_times_are_reasonable
Outcome:Passed
Duration:0.0 sec
FailedNone
None

Test Suite: sms_installed.bats

Results

Duration: 0.025 sec
Tests: 2
Failures: 0

Tests

sms_installed.bats

Test case:Verify hostname matches expectations
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:Base OS check
Outcome:Passed
Duration:0.014 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 1.525 sec
Tests: 4
Failures: 2

Tests

rm_execution

Test case:[modules] env variable passes through ()
Outcome:Passed
Duration:0.376 sec
FailedNone
None
Test case:[modules] loaded module passes through ()
Outcome:Failed
Duration:0.449 sec
FailedNone
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 36)
  `assert_success' failed

-- command failed --
status : 1
output : srun: error: c1: task 0: Exited with exit code 1
--
Test case:[modules] module commands available in RMS job ()
Outcome:Failed
Duration:0.33 sec
FailedNone
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 44)
  `assert_success' failed

-- command failed --
status : 1
output (2 lines):
  srun: error: c1: task 0: Exited with exit code 1
  environment: line 17: /opt/ohpc/admin/lmod/lmod/libexec/lmod: No such file or directory
--
Test case:[modules] module load propagates thru RMS ()
Outcome:Passed
Duration:0.37 sec
FailedNone
None
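Note: of the two failures above, the second explicitly reports /opt/ohpc/admin/lmod/lmod/libexec/lmod: No such file or directory on c1, and the first is consistent with the same cause (Lmod missing from the compute image). A minimal manual check, not part of the automated run (host name taken from the log; the lmod-ohpc package name is an assumption):

  # confirm the Lmod installation exists on the compute node
  ssh c1 'test -x /opt/ohpc/admin/lmod/lmod/libexec/lmod && echo "lmod present" || echo "lmod missing"'
  # check whether the OpenHPC Lmod package is installed in the compute image (assumed package name)
  ssh c1 'rpm -q lmod-ohpc'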

Test Suite: interactive_commands

Results

Duration: 4.181 sec
Tests: 8
Failures: 0

Tests

interactive_commands

Test case:[modules] module purge
Outcome:Passed
Duration:0.126 sec
FailedNone
None
Test case:[modules] module list
Outcome:Passed
Duration:0.19 sec
FailedNone
None
Test case:[modules] module help
Outcome:Passed
Duration:0.208 sec
FailedNone
None
Test case:[modules] module load/unload
Outcome:Passed
Duration:2.052 sec
FailedNone
None
Test case:[modules] module whatis
Outcome:Passed
Duration:0.209 sec
FailedNone
None
Test case:[modules] module swap
Outcome:Passed
Duration:0.419 sec
FailedNone
None
Test case:[modules] path updated
Outcome:Passed
Duration:0.568 sec
FailedNone
None
Test case:[modules] module depends-on
Outcome:Passed
Duration:0.409 sec
FailedNone
None

Test Suite: lmod_installed

Results

Duration: 0.021 sec
Tests: 1
Failures: 0

Tests

lmod_installed

Test case:[modules] Check if lmod RPM installed
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 3.236 sec
Tests: 2
Failures: 2

Tests

rm_execution

Test case:[libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.133 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.2397
Batch job 31 submitted

Job 31 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./arrays_write  (family=openmpi5)
Test case:[libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.103 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26150
Batch job 32 submitted

Job 32 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./scalars_write  (family=openmpi5)
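Note: both ADIOS2 jobs above report only Reason=NonZeroExitCode, so the binaries' own error output is not captured in this report. One way to surface it is to rerun the same launch interactively from the ADIOS2 test directory; a sketch assuming the gnu14/openmpi5 modules shown in this report (node/task counts are placeholders):

  # obtain an interactive allocation, load the toolchain, and reuse the harness launch wrapper
  salloc -N 2 -n 8
  module load gnu14 openmpi5
  prun ./arrays_write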

Test Suite: test_module

Results

Duration: 0.184 sec
Tests: 7
Failures: 0

Tests

test_module

Test case:[libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.125 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome:Passed
Duration:0.012 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 295.251 sec
Tests: 2
Failures: 2

Tests

rm_execution

Test case:[libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:145.044 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.140
Batch job 29 submitted

Job 29 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./arrays_write  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 29 ON c1 CANCELLED AT 2024-10-21T19:03:02 DUE TO TIME LIMIT ***
Test case:[libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.207 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.14151
Batch job 30 submitted

Job 30 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./scalars_write  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 30 ON c1 CANCELLED AT 2024-10-21T19:05:33 DUE TO TIME LIMIT ***
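Note: the mpich-family timeouts above all show /bin/srun rejecting the '--external-launcher' option that mpiexec.hydra passes when bootstrapping through Slurm, so no ranks ever start and the jobs run to their time limit. This pattern suggests a version mismatch between the installed Slurm and the Slurm support the MPICH stack was built against; a quick manual check (not part of the automated run):

  # report the installed Slurm version
  srun --version
  # see whether this srun recognizes the option hydra is passing
  srun --help 2>&1 | grep -- '--external-launcher' || echo "srun does not support --external-launcher"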

Test Suite: test_module

Results

Duration: 0.178 sec
Tests: 7
Failures: 0

Tests

test_module

Test case:[libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.12 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 4.125 sec
Tests: 11
Failures: 0

Tests

rm_execution

Test case:[libs/openblas/dblat1] dblat1 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.154 sec
FailedNone
None
Test case:[libs/openblas/xccblat1] xccblat1 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.211 sec
FailedNone
None
Test case:[libs/openblas/xzcblat1] xzcblat1 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.175 sec
FailedNone
None
Test case:[libs/openblas/xscblat2] xscblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.43 sec
FailedNone
None
Test case:[libs/openblas/xdcblat2] xdcblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.407 sec
FailedNone
None
Test case:[libs/openblas/xccblat2] xccblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.581 sec
FailedNone
None
Test case:[libs/openblas/xzcblat2] xzcblat2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.58 sec
FailedNone
None
Test case:[libs/openblas/xscblat3] xscblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.225 sec
FailedNone
None
Test case:[libs/openblas/xdcblat3] xdcblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.265 sec
FailedNone
None
Test case:[libs/openblas/xccblat3] xccblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.289 sec
FailedNone
None
Test case:[libs/openblas/xzcblat3] xzcblat3 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.808 sec
FailedNone
None

Test Suite: test_module

Results

Duration: 0.134 sec
Tests: 5
Failures: 0

Tests

test_module

Test case:[libs/OpenBLAS] Verify OPENBLAS module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.098 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify module OPENBLAS_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify module OPENBLAS_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify dynamic library available in OPENBLAS_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify static library is not present in OPENBLAS_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_lapack

Results

Duration: 0.451 sec
Tests: 1
Failures: 0

Tests

test_lapack

Test case:[libs/OpenBLAS/eigen] run lapack eigen-value solver (gnu14)
Outcome:Passed
Duration:0.451 sec
FailedNone
None

Test Suite: test_pnetcdf

Results

Duration: 3.242 sec
Tests: 2
Failures: 2

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.106 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_pnetcdf, line 28)
  `run_mpi_binary ./C_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.11850
Batch job 205 submitted

Job 205 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./C_parallel atest (family=openmpi5)
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.136 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_pnetcdf, line 64)
  `run_mpi_binary -t "00:02:00" ./F90_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.10674
Batch job 206 submitted

Job 206 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./F90_parallel atest (family=openmpi5)

Test Suite: test_pnetcdf

Results

Duration: 298.538 sec
Tests: 2
Failures: 2

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.225 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_pnetcdf, line 28)
  `run_mpi_binary ./C_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.428
Batch job 200 submitted

Job 200 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_parallel atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 200 ON c1 CANCELLED AT 2024-10-21T20:42:33 DUE TO TIME LIMIT ***
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.313 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_pnetcdf, line 64)
  `run_mpi_binary -t "00:02:00" ./F90_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.26604
Batch job 201 submitted

Job 201 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F90_parallel atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 201 ON c1 CANCELLED AT 2024-10-21T20:45:02 DUE TO TIME LIMIT ***

Test Suite: test_Fortran_module

Results

Duration: 0.21 sec
Tests: 9
Failures: 0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.12 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration: 0.21 sec
Tests: 9
Failures: 0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.12 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration: 0.859 sec
Tests: 7
Failures: 3

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/openmpi5)
Outcome:Passed
Duration:0.078 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/openmpi5)
Outcome:Passed
Duration:0.027 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/openmpi5)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.286 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 64)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/netcdf/tests/./C_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.199 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 105)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/netcdf/tests/./F90_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.251 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 147)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/netcdf/tests/./CXX_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
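Note: the three write/read failures above share one cause: the openmpi5-linked test binaries cannot resolve librdmacm.so.1 on c1 (exit code 127). A possible check and remedy for the compute image, with the package name assumed for an EL-based distribution:

  # confirm whether the RDMA connection-manager library is visible to the dynamic linker on c1
  ssh c1 'ldconfig -p | grep librdmacm || echo "librdmacm not found"'
  # if missing, installing librdmacm (rdma-core) into the compute image is a likely fix (assumed package name)
  ssh c1 'dnf -y install librdmacm'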

Test Suite: test_C_module

Results

Duration: 0.205 sec
Tests: 9
Failures: 0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.117 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration: 0.207 sec
Tests: 9
Failures: 0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.12 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration: 0.201 sec
Tests: 9
Failures: 0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.114 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration: 0.779 sec
Tests: 7
Failures: 3

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/mpich)
Outcome:Passed
Duration:0.038 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/mpich)
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/mpich)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.237 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 64)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/netcdf/tests/./C_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.222 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 105)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/netcdf/tests/./F90_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.238 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 147)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/netcdf/tests/./CXX_write: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--

Test Suite: test_C_module

Results

Duration: 0.199 sec
Tests: 9
Failures: 0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.111 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration: 0.193 sec
Tests: 9
Failures: 0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.107 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration: 0.195 sec
Tests: 9
Failures: 0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.108 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration: 1.572 sec
Tests: 7
Failures: 0

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14)
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14)
Outcome:Passed
Duration:0.017 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14)
Outcome:Passed
Duration:0.459 sec
FailedNone
None
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14)
Outcome:Passed
Duration:0.421 sec
FailedNone
None
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14)
Outcome:Passed
Duration:0.616 sec
FailedNone
None

Test Suite: test_C_module

Results

Duration: 0.191 sec
Tests: 9
Failures: 0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.105 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_module

Results

Duration: 0.185 sec
Tests: 9
Failures: 0

Tests

test_module

Test case:[libs/Metis] Verify METIS module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.099 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/Metis] Verify availability of m2gmetis binary (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/Metis] Verify dynamic library available in METIS_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/Metis] Verify static library is not present in METIS_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/Metis] Verify header file is present in METIS_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_metis

Results

Duration: 4.111 sec
Tests: 4
Failures: 0

Tests

test_metis

Test case:[libs/Metis] Graph partition (gnu14)
Outcome:Passed
Duration:0.409 sec
FailedNone
None
Test case:[libs/Metis] Fill-reducing ordering (gnu14)
Outcome:Passed
Duration:3.27 sec
FailedNone
None
Test case:[libs/Metis] Mesh to graph conversion (gnu14)
Outcome:Passed
Duration:0.03 sec
FailedNone
None
Test case:[libs/Metis] C API mesh partitioning (slurm/gnu14)
Outcome:Passed
Duration:0.402 sec
FailedNone
None

Test Suite: sms_execution

Results

Duration: 1.363 sec
Tests: 1
Failures: 0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/openmpi5)
Outcome:Passed
Duration:1.363 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 2.175 sec
Tests: 2
Failures: 1

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.175 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 39)
  `assert_success' failed

-- command failed --
status : 1
output (9 lines):
  job script = /tmp/job.ohpc-test.28415
  Batch job 225 submitted

  Job 225 failed...
  Reason=NonZeroExitCode

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpirun -- ./hello  (family=openmpi5)
--
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants

Test Suite: test_module

Results

Duration: 0.183 sec
Tests: 9
Failures: 0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.11 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: sms_execution

Results

Duration: 0.266 sec
Tests: 1
Failures: 0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/mpich)
Outcome:Passed
Duration:0.266 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 122.67 sec
Tests: 2
Failures: 1

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:122.67 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 39)
  `assert_success' failed

-- command failed --
status : 1
output (12 lines):
  job script = /tmp/job.ohpc-test.13692
  Batch job 224 submitted

  Job 224 encountered timeout...
  Reason=TimeLimit

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./hello  (family=mpich)
  /bin/srun: unrecognized option '--external-launcher'
  Try "srun --help" for more information
  slurmstepd-c1: error: *** JOB 224 ON c1 CANCELLED AT 2024-10-21T20:48:33 DUE TO TIME LIMIT ***
--
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants

Test Suite: test_module

Results

Duration: 0.179 sec
Tests: 9
Failures: 0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.106 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 12.72 sec
Tests: 5
Failures: 5

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.103 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 26)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.23020
Batch job 186 submitted

Job 186 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./C_double null (family=openmpi5)
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.161 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11024
Batch job 187 submitted

Job 187 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./F_single null (family=openmpi5)
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.162 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26931
Batch job 188 submitted

Job 188 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./F_double null (family=openmpi5)
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.161 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.32198
Batch job 189 submitted

Job 189 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./F_complex null (family=openmpi5)
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.133 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 66)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26352
Batch job 190 submitted

Job 190 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./F_doublecomplex null (family=openmpi5)

Test Suite: test_module

Results

Duration: 0.127 sec
Tests: 2
Failures: 0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.117 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 746.686 sec
Tests: 5
Failures: 5

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:147.071 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 26)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11212
Batch job 181 submitted

Job 181 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_double null (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 181 ON c1 CANCELLED AT 2024-10-21T20:26:33 DUE TO TIME LIMIT ***
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.169 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.1579
Batch job 182 submitted

Job 182 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_single null (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 182 ON c1 CANCELLED AT 2024-10-21T20:29:03 DUE TO TIME LIMIT ***
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:148.108 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.4781
Batch job 183 submitted

Job 183 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_double null (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 183 ON c1 CANCELLED AT 2024-10-21T20:31:32 DUE TO TIME LIMIT ***
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.167 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9503
Batch job 184 submitted

Job 184 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_complex null (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 184 ON c1 CANCELLED AT 2024-10-21T20:34:02 DUE TO TIME LIMIT ***
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:151.171 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 66)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.5311
Batch job 185 submitted

Job 185 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./F_doublecomplex null (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 185 ON c1 CANCELLED AT 2024-10-21T20:36:33 DUE TO TIME LIMIT ***

Test Suite: test_module

Results

Duration: 0.12 sec
Tests: 2
Failures: 0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.111 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 1.594 sec
Tests: 3
Failures: 3

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.231 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 26)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/fftw/tests/./C_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.142 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 40)
  `assert_success' failed

-- command failed --
status : 1
output (9 lines):
  job script = /tmp/job.ohpc-test.3073
  Batch job 101 submitted

  Job 101 failed...
  Reason=NonZeroExitCode

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpirun -- ./C_mpi_test 8 (family=openmpi5)
--
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.221 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 49)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/fftw/tests/./F_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--

Test Suite: test_module

Results

Duration: 0.162 sec
Tests: 7
Failures: 0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.108 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_module

Results

Duration: 0.159 sec
Tests: 7
Failures: 0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.105 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration: 324.363 sec
Tests: 3
Failures: 3

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.205 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 26)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/fftw/tests/./C_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:323.918 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 40)
  `assert_success' failed

-- command failed --
status : 1
output (12 lines):
  job script = /tmp/job.ohpc-test.25346
  Batch job 98 submitted

  Job 98 encountered timeout...
  Reason=TimeLimit

  [prun] Master compute host = c1
  [prun] Resource manager = slurm
  [prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_mpi_test 8 (family=mpich)
  /bin/srun: unrecognized option '--external-launcher'
  Try "srun --help" for more information
  slurmstepd-c1: error: *** JOB 98 ON c1 CANCELLED AT 2024-10-21T19:38:32 DUE TO TIME LIMIT ***
--
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.24 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 49)
  `assert_success' failed

-- command failed --
status : 127
output (2 lines):
  /home/ohpc-test/tests/libs/fftw/tests/./F_test: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
  srun: error: c1: task 0: Exited with exit code 127
--

Test Suite: rm_execution

Results

Duration: 15.852 sec
Tests: 6
Failures: 6

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.1 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 21)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.27201
Batch job 91 submitted

Job 91 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./all_gather_test atest (family=openmpi5)
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.156 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 30)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.26579
Batch job 92 submitted

Job 92 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./all_reduce_test atest (family=openmpi5)
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.129 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 39)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.2624
Batch job 93 submitted

Job 93 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./all_to_all_test atest (family=openmpi5)
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.155 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 48)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.7081
Batch job 94 submitted

Job 94 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./groups_test atest (family=openmpi5)
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.156 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 66)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.3988
Batch job 95 submitted

Job 95 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ring_test atest (family=openmpi5)
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.156 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 75)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.31550
Batch job 96 submitted

Job 96 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./pointer_test atest (family=openmpi5)

Test Suite: rm_execution

Results

Duration: 898.074 sec
Tests: 6
Failures: 6

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:147.97 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 21)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.29089
Batch job 85 submitted

Job 85 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./all_gather_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 85 ON c1 CANCELLED AT 2024-10-21T19:19:02 DUE TO TIME LIMIT ***
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.008 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 30)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.21518
Batch job 86 submitted

Job 86 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./all_reduce_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 86 ON c1 CANCELLED AT 2024-10-21T19:21:32 DUE TO TIME LIMIT ***
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:151.1 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 39)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.11032
Batch job 87 submitted

Job 87 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./all_to_all_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 87 ON c1 CANCELLED AT 2024-10-21T19:24:03 DUE TO TIME LIMIT ***
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.022 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 48)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.30766
Batch job 88 submitted

Job 88 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./groups_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 88 ON c1 CANCELLED AT 2024-10-21T19:26:32 DUE TO TIME LIMIT ***
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.988 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 66)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.26987
Batch job 89 submitted

Job 89 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ring_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 89 ON c1 CANCELLED AT 2024-10-21T19:29:02 DUE TO TIME LIMIT ***
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.986 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 75)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.32694
Batch job 90 submitted

Job 90 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./pointer_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 90 ON c1 CANCELLED AT 2024-10-21T19:31:32 DUE TO TIME LIMIT ***
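
Note: every mpich-family failure in this suite follows the same pattern — the hydra launcher invokes srun with --external-launcher, the installed srun rejects the option, and the job then idles until Slurm cancels it at the time limit. A minimal diagnostic sketch, assuming the Slurm client tools and the gnu14/mpich modules named in this report are available on the test host (commands only; not part of the harness):

  srun --version                # Slurm version installed on the cluster image
  module load gnu14 mpich       # toolchain/MPI modules used by these tests
  mpiexec.hydra -info           # prints this hydra build's configuration, including how it bootstraps jobs

Comparing the installed Slurm version against the one this MPICH build expects should confirm whether the --external-launcher mismatch is a packaging/version skew rather than a per-test problem.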

Test Suite: test_module

Results

Duration0.182 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.127 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.181 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.125 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration26.416 sec
Tests12
Failures9

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.134 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 23)
  `run_mpi_binary ./ex1 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.5830
Batch job 162 submitted

Job 162 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex1 8 (family=openmpi5)
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.134 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 32)
  `run_mpi_binary ./ex2 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.25951
Batch job 163 submitted

Job 163 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex2 8 (family=openmpi5)
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.193 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 41)
  `run_mpi_binary ./ex6 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17622
Batch job 164 submitted

Job 164 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex6  (family=openmpi5)
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.163 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 50)
  `run_mpi_binary ./ex8 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.25280
Batch job 165 submitted

Job 165 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex8  (family=openmpi5)
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.135 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 59)
  `run_mpi_binary ./ex12f "" 1 2' failed
job script = /tmp/job.ohpc-test.12790
Batch job 166 submitted

Job 166 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex12f  (family=openmpi5)
./ex12f: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./ex12f: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
--------------------------------------------------------------------------
prterun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prterun-c1-5772@1,0]
   Exit code:    127
--------------------------------------------------------------------------
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.193 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 68)
  `run_mpi_binary ./ex3 "-n 33 -solver 0 -v 1 1" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26307
Batch job 167 submitted

Job 167 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex3 -n 33 -solver 0 -v 1 1 (family=openmpi5)
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.105 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 77)
  `run_mpi_binary ./ex4 "-n 33 -solver 10  -K 3 -B 0 -C 1 -U0 2 -F 4" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15687
Batch job 168 submitted

Job 168 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex4 -n 33 -solver 10 -K 3 -B 0 -C 1 -U0 2 -F 4 (family=openmpi5)
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.194 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 86)
  `run_mpi_binary ./ex5f "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.3309
Batch job 169 submitted

Job 169 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex5f  (family=openmpi5)
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.165 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 125)
  `run_mpi_binary ./ex11 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.12946
Batch job 170 submitted

Job 170 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex11  (family=openmpi5)

Test Suite: test_module

Results

Duration1.288 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.119 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.103 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_module, line 131)
  `run_mpi_binary ./ex1 "atest" 1 1' failed
job script = /tmp/job.ohpc-test.1390
Batch job 161 submitted

Job 161 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./ex1 atest (family=openmpi5)
./ex1: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
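
Note: the openmpi5 HYPRE failures that print diagnostics (ex1 above, ex12f earlier) all trace back to an unresolved librdmacm.so.1 on the compute host. A quick confirmation sketch, run on the compute node (c1); the /usr/lib64 path is an assumption about the library layout:

  ldd ./ex1 | grep 'not found'       # expect librdmacm.so.1 to be flagged as unresolved
  ls /usr/lib64/librdmacm.so*        # check whether the rdma-core user-space library is installed at all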

Test Suite: rm_execution

Results

Duration1347.965 sec
Tests12
Failures9

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.206 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 23)
  `run_mpi_binary ./ex1 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.12572
Batch job 152 submitted

Job 152 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex1 8 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 152 ON c1 CANCELLED AT 2024-10-21T19:47:33 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.209 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 32)
  `run_mpi_binary ./ex2 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.25600
Batch job 153 submitted

Job 153 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex2 8 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 153 ON c1 CANCELLED AT 2024-10-21T19:50:03 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.235 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 41)
  `run_mpi_binary ./ex6 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.7501
Batch job 154 submitted

Job 154 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex6  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 154 ON c1 CANCELLED AT 2024-10-21T19:52:32 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.236 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 50)
  `run_mpi_binary ./ex8 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.3131
Batch job 155 submitted

Job 155 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex8  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 155 ON c1 CANCELLED AT 2024-10-21T19:55:02 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:151.287 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 59)
  `run_mpi_binary ./ex12f "" 1 2' failed
job script = /tmp/job.ohpc-test.31799
Batch job 156 submitted

Job 156 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex12f  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 156 ON c1 CANCELLED AT 2024-10-21T19:57:33 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.214 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 68)
  `run_mpi_binary ./ex3 "-n 33 -solver 0 -v 1 1" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18389
Batch job 157 submitted

Job 157 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex3 -n 33 -solver 0 -v 1 1 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 157 ON c1 CANCELLED AT 2024-10-21T20:00:03 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.202 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 77)
  `run_mpi_binary ./ex4 "-n 33 -solver 10  -K 3 -B 0 -C 1 -U0 2 -F 4" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9574
Batch job 158 submitted

Job 158 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex4 -n 33 -solver 10 -K 3 -B 0 -C 1 -U0 2 -F 4 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 158 ON c1 CANCELLED AT 2024-10-21T20:02:33 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.233 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 86)
  `run_mpi_binary ./ex5f "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26466
Batch job 159 submitted

Job 159 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex5f  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 159 ON c1 CANCELLED AT 2024-10-21T20:05:02 DUE TO TIME LIMIT ***
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:149.143 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 125)
  `run_mpi_binary ./ex11 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26465
Batch job 160 submitted

Job 160 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex11  (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 160 ON c1 CANCELLED AT 2024-10-21T20:07:33 DUE TO TIME LIMIT ***

Test Suite: test_module

Results

Duration150.318 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.113 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.139 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_module, line 131)
  `run_mpi_binary ./ex1 "atest" 1 1' failed
job script = /tmp/job.ohpc-test.17804
Batch job 151 submitted

Job 151 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./ex1 atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 151 ON c1 CANCELLED AT 2024-10-21T19:45:03 DUE TO TIME LIMIT ***

Test Suite: rm_execution

Results

Duration1.718 sec
Tests11
Failures0

Tests

rm_execution

Test case:[Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[Boost/Program Options] exception_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.207 sec
FailedNone
None
Test case:[Boost/Program Options] options_description_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.182 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test on master host(gnu14/openmpi5)
Outcome:Passed
Duration:0.013 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.217 sec
FailedNone
None
Test case:[Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.163 sec
FailedNone
None
Test case:[Boost/Program Options] required_test on master host(gnu14/openmpi5)
Outcome:Passed
Duration:0.013 sec
FailedNone
None
Test case:[Boost/Program Options] required_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.159 sec
FailedNone
None
Test case:[Boost/Program Options] unicode_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.193 sec
FailedNone
None
Test case:[Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.197 sec
FailedNone
None
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.182 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.629 sec
Tests11
Failures0

Tests

rm_execution

Test case:[Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[Boost/Program Options] exception_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.238 sec
FailedNone
None
Test case:[Boost/Program Options] options_description_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test on master host(gnu14/mpich)
Outcome:Passed
Duration:0.012 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.103 sec
FailedNone
None
Test case:[Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.2 sec
FailedNone
None
Test case:[Boost/Program Options] required_test on master host(gnu14/mpich)
Outcome:Passed
Duration:0.012 sec
FailedNone
None
Test case:[Boost/Program Options] required_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[Boost/Program Options] unicode_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.186 sec
FailedNone
None
Test case:[Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.157 sec
FailedNone
None
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.198 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.754 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Accumulators] min-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[Boost/Accumulators] max-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.206 sec
FailedNone
None
Test case:[Boost/Accumulators] variance-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.203 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.798 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Accumulators] min-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.2 sec
FailedNone
None
Test case:[Boost/Accumulators] max-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.209 sec
FailedNone
None
Test case:[Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.182 sec
FailedNone
None
Test case:[Boost/Accumulators] variance-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.207 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.279 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:1.501 sec
FailedNone
None
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.412 sec
FailedNone
None
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.189 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.271 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:1.501 sec
FailedNone
None
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.4 sec
FailedNone
None
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.198 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.627 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Multi Array] access-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.165 sec
FailedNone
None
Test case:[Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.137 sec
FailedNone
None
Test case:[Boost/Multi Array] resize-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:0.147 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.705 sec
Tests4
Failures0

Tests

rm_execution

Test case:[Boost/Multi Array] access-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[Boost/Multi Array] resize-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:0.186 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.151 sec
Tests1
Failures0

Tests

rm_execution

Test case:[Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:2.151 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.118 sec
Tests1
Failures0

Tests

rm_execution

Test case:[Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.118 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.554 sec
Tests3
Failures3

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.178 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/./bad_expression_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c1: task 0: Exited with exit code 127
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.208 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/./named_subexpressions_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c2: task 0: Exited with exit code 127
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.168 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test config_test.cfg ' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/./recursion_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c1: task 0: Exited with exit code 127
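
Note: all three regex test binaries fail identically — they were linked against ICU 67 (libicudata.so.67), which is not present on the compute hosts. A short check, assuming the binary and system library paths shown in the log:

  ldd ./bad_expression_test | grep libicu    # ICU sonames the test binary was linked against
  ls /usr/lib64/libicudata.so*               # ICU version(s) actually installed on the node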

Test Suite: rm_execution

Results

Duration0.2 sec
Tests1
Failures1

Tests

rm_execution

Test case:[Boost] regress under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.2 sec
FailedNone
(from function `run_serial_binary' in file ../../../../../../common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/regress/./regress: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c1: task 0: Exited with exit code 127

Test Suite: rm_execution

Results

Duration0.544 sec
Tests3
Failures3

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.176 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/./bad_expression_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c1: task 0: Exited with exit code 127
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.2 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/./named_subexpressions_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c2: task 0: Exited with exit code 127
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.168 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test config_test.cfg ' failed with status 127
/home/ohpc-test/tests/libs/boost/tests/regex/test/./recursion_test: error while loading shared libraries: libicudata.so.67: cannot open shared object file: No such file or directory
srun: error: c1: task 0: Exited with exit code 127

Test Suite: test_module

Results

Duration4.505 sec
Tests8
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.129 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Test interactive build/exec of f_test.cpp (gnu14/openmpi5)
Outcome:Passed
Duration:4.32 sec
FailedNone
None

Test Suite: test_module

Results

Duration4.513 sec
Tests8
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.123 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[BOOST] Test interactive build/exec of f_test.cpp (gnu14/mpich)
Outcome:Passed
Duration:4.334 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration79.896 sec
Tests50
Failures0

Tests

rm_execution

Test case:[libs/GSL] run test_gsl_histogram (gnu14)
Outcome:Passed
Duration:0.018 sec
FailedNone
None
Test case:[libs/GSL] run block under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.71 sec
FailedNone
None
Test case:[libs/GSL] run bspline under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.722 sec
FailedNone
None
Test case:[libs/GSL] run cblas under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.168 sec
FailedNone
None
Test case:[libs/GSL] run cdf under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[libs/GSL] run cheb under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.168 sec
FailedNone
None
Test case:[libs/GSL] run combination under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.152 sec
FailedNone
None
Test case:[libs/GSL] run complex under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.143 sec
FailedNone
None
Test case:[libs/GSL] run const under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.15 sec
FailedNone
None
Test case:[libs/GSL] run deriv under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.159 sec
FailedNone
None
Test case:[libs/GSL] run dht under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.287 sec
FailedNone
None
Test case:[libs/GSL] run diff under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.161 sec
FailedNone
None
Test case:[libs/GSL] run eigen under resource manager (slurm/gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping eigen test for ARCH=aarch64
Test case:[libs/GSL] run err under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[libs/GSL] run fft under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.599 sec
FailedNone
None
Test case:[libs/GSL] run fit under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.146 sec
FailedNone
None
Test case:[libs/GSL] run histogram under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.309 sec
FailedNone
None
Test case:[libs/GSL] run ieee-utils under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.175 sec
FailedNone
None
Test case:[libs/GSL] run integration under resource manager (slurm/gnu14)
Outcome:Passed
Duration:3.064 sec
FailedNone
None
Test case:[libs/GSL] run interpolation under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.186 sec
FailedNone
None
Test case:[libs/GSL] run linalg under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.839 sec
FailedNone
None
Test case:[libs/GSL] run matrix under resource manager (slurm/gnu14)
Outcome:Passed
Duration:2.728 sec
FailedNone
None
Test case:[libs/GSL] run min under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[libs/GSL] run monte under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.767 sec
FailedNone
None
Test case:[libs/GSL] run multifit under resource manager (slurm/gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping multifit test for ARCH=aarch64
Test case:[libs/GSL] run multilarge under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[libs/GSL] run multimin under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.162 sec
FailedNone
None
Test case:[libs/GSL] run multiroots under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[libs/GSL] run multiset under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.2 sec
FailedNone
None
Test case:[libs/GSL] run ntuple under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.146 sec
FailedNone
None
Test case:[libs/GSL] run ode-initval under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.618 sec
FailedNone
None
Test case:[libs/GSL] run ode-initval2 under resource manager (slurm/gnu14)
Outcome:Passed
Duration:9.457 sec
FailedNone
None
Test case:[libs/GSL] run permutation under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.138 sec
FailedNone
None
Test case:[libs/GSL] run poly under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[libs/GSL] run qrng under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.196 sec
FailedNone
None
Test case:[libs/GSL] run randist under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.655 sec
FailedNone
None
Test case:[libs/GSL] run rng under resource manager (slurm/gnu14)
Outcome:Passed
Duration:3.457 sec
FailedNone
None
Test case:[libs/GSL] run roots under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.16 sec
FailedNone
None
Test case:[libs/GSL] run rstat under resource manager (slurm/gnu14)
Outcome:Passed
Duration:6.904 sec
FailedNone
None
Test case:[libs/GSL] run siman under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.337 sec
FailedNone
None
Test case:[libs/GSL] run sort under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.201 sec
FailedNone
None
Test case:[libs/GSL] run spblas under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.379 sec
FailedNone
None
Test case:[libs/GSL] run specfunc under resource manager (slurm/gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping specfun test for ARCH=aarch64
Test case:[libs/GSL] run splinalg under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.43 sec
FailedNone
None
Test case:[libs/GSL] run spmatrix under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.184 sec
FailedNone
None
Test case:[libs/GSL] run statistics under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[libs/GSL] run sum under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[libs/GSL] run sys under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.164 sec
FailedNone
None
Test case:[libs/GSL] run vector under resource manager (slurm/gnu14)
Outcome:Passed
Duration:38.078 sec
FailedNone
None
Test case:[libs/GSL] run wavelet under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.001 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.149 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/GSL] Verify GSL module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.094 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/GSL] Verify dynamic library available in GSL_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/GSL] Verify static library is not present in GSL_LIB (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/GSL] Verify header file is present in GSL_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.2 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.14 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration9.617 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.146 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.6109
Batch job 175 submitted

Job 175 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./p_laplace -no-vis (family=openmpi5)
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.114 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "-no-vis -rs 2" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.14412
Batch job 176 submitted

Job 176 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./p_laplace_perf -no-vis -rs 2 (family=openmpi5)
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.178 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 44)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.22749
Batch job 177 submitted

Job 177 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./p_cantilever -no-vis (family=openmpi5)
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.179 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 54)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.15697
Batch job 178 submitted

Job 178 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./p_diffusion -no-vis (family=openmpi5)

Test Suite: rm_execution

Results

Duration597.952 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:147.697 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.26106
Batch job 171 submitted

Job 171 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_laplace -no-vis (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 171 ON c1 CANCELLED AT 2024-10-21T20:13:33 DUE TO TIME LIMIT ***
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.778 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "-no-vis -rs 2" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.25249
Batch job 172 submitted

Job 172 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_laplace_perf -no-vis -rs 2 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 172 ON c1 CANCELLED AT 2024-10-21T20:16:03 DUE TO TIME LIMIT ***
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:148.712 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 44)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.12848
Batch job 173 submitted

Job 173 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_cantilever -no-vis (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 173 ON c1 CANCELLED AT 2024-10-21T20:18:32 DUE TO TIME LIMIT ***
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:150.765 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution, line 54)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.6129
Batch job 174 submitted

Job 174 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./p_diffusion -no-vis (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 174 ON c1 CANCELLED AT 2024-10-21T20:21:02 DUE TO TIME LIMIT ***

Test Suite: test_module

Results

Duration0.193 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.134 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None

Test Suite: test_module

Results

Duration132.021 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.126 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_BIN is defined and exists
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/PETSc] Verify header file is present in PETSC_INC (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[libs/PETSc] Sample job (slurm/gnu14/mpich)
Outcome:Failed
Duration:131.829 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file test_module, line 130)
  `run_mpi_binary ./C_test "atest" 1 1' failed
job script = /tmp/job.ohpc-test.27107
Batch job 226 submitted

Job 226 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./C_test atest (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 226 ON c1 CANCELLED AT 2024-10-21T20:51:03 DUE TO TIME LIMIT ***

Test Suite: rm_execution_multi_host

Results

Duration3.169 sec
Tests2
Failures2

Tests

rm_execution_multi_host

Test case:[Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.156 sec
FailedNone
(from function `run_mpi_binary' in file ./common-test/functions, line 414,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.2881
Batch job 15 submitted

Job 15 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./src/miniFE.x.gnu14.openmpi5 nx=256 ny=256 nz=256 verify_solution=0 (family=openmpi5)
Test case:[Apps/miniFE] log miniFE multi node results (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.013 sec
FailedNone
(in test file rm_execution_multi_host, line 54)
  `mv $run_yaml $wrk_yaml || exit 1' failed
ls: cannot access 'miniFE.256x256x256.P16.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.256x256x256.P16.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration1.119 sec
Tests2
Failures2

Tests

rm_execution_single_host

Test case:[Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.102 sec
FailedNone
(from function `run_mpi_binary' in file ./common-test/functions, line 414,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.23582
Batch job 14 submitted

Job 14 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./src/miniFE.x.gnu14.openmpi5 nx=100 ny=100 nz=100 verify_solution=1 (family=openmpi5)
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
./src/miniFE.x.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
--------------------------------------------------------------------------
prterun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prterun-c1-3126@1,2]
   Exit code:    127
--------------------------------------------------------------------------
Test case:[Apps/miniFE] log miniFE single node results (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.017 sec
FailedNone
(from function `flunk' in file ./common-test/test_helper_functions.bash, line 14,
 in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml || flunk "Unable to move ${run_yaml} file to ${work_yaml}"' failed
ls: cannot access 'miniFE.100x100x100.P8.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.100x100x100.P8.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.
Unable to move  file to
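
Note: the "log results" cases in this and the preceding multi-node suite fail only because the run step produced no results file, so $run_yaml expands to nothing and mv is left with a single operand. A defensive variant of that step is sketched below; the variable names and file pattern are assumptions reconstructed from the log, not the harness's actual code:

  run_yaml=$(ls miniFE.100x100x100.P8.*.yaml 2>/dev/null | tail -n 1)
  wrk_yaml=miniFE.100x100x100.P8.gnu14.openmpi5.yaml
  if [ -n "$run_yaml" ]; then
      mv "$run_yaml" "$wrk_yaml"
  else
      echo "no miniFE results file found; the run step likely failed" >&2
      exit 1
  fi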

Test Suite: build

Results

Duration11.073 sec
Tests1
Failures0

Tests

build

Test case:[Apps/miniFE] build MiniFE executable (gnu14/openmpi5)
Outcome:Passed
Duration:11.073 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration330.647 sec
Tests2
Failures2

Tests

rm_execution_multi_host

Test case:[Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:330.634 sec
FailedNone
(from function `run_mpi_binary' in file ./common-test/functions, line 414,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31660
Batch job 13 submitted

Job 13 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./src/miniFE.x.gnu14.mpich nx=256 ny=256 nz=256 verify_solution=0 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 13 ON c1 CANCELLED AT 2024-10-21T18:57:03 DUE TO TIME LIMIT ***
Test case:[Apps/miniFE] log miniFE multi node results (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.013 sec
FailedNone
(in test file rm_execution_multi_host, line 54)
  `mv $run_yaml $wrk_yaml || exit 1' failed
ls: cannot access 'miniFE.256x256x256.P16.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.256x256x256.P16.gnu14.mpich.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration329.687 sec
Tests2
Failures2

Tests

rm_execution_single_host

Test case:[Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:329.671 sec
FailedNone
(from function `run_mpi_binary' in file ./common-test/functions, line 414,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.27621
Batch job 12 submitted

Job 12 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./src/miniFE.x.gnu14.mpich nx=100 ny=100 nz=100 verify_solution=1 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 12 ON c1 CANCELLED AT 2024-10-21T18:51:32 DUE TO TIME LIMIT ***
Test case:[Apps/miniFE] log miniFE single node results (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.016 sec
FailedNone
(from function `flunk' in file ./common-test/test_helper_functions.bash, line 14,
 in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml || flunk "Unable to move ${run_yaml} file to ${work_yaml}"' failed
ls: cannot access 'miniFE.100x100x100.P8.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.100x100x100.P8.gnu14.mpich.yaml'
Try 'mv --help' for more information.
Unable to move  file to

Test Suite: build

Results

Duration14.329 sec
Tests1
Failures0

Tests

build

Test case:[Apps/miniFE] build MiniFE executable (gnu14/mpich)
Outcome:Passed
Duration:14.329 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration2.153 sec
Tests3
Failures2

Tests

rm_execution_multi_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.129 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.1798
Batch job 11 submitted

Job 11 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./bin/xhpcg.gnu14.openmpi5 32 32 32 10 (family=openmpi5)
Test case:[/apps/hpcg] log HPCG multi node results (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.015 sec
FailedNone
(in test file rm_execution_multi_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Benchmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P16.gnu14.openmpi5.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P16.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration1.127 sec
Tests3
Failures2

Tests

rm_execution_single_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.103 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT ${EXE} "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.24985
Batch job 10 submitted

Job 10 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- ./bin/xhpcg.gnu14.openmpi5 32 32 32 10 (family=openmpi5)
./bin/xhpcg.gnu14.openmpi5: error while loading shared libraries: librdmacm.so.1: cannot open shared object file: No such file or directory
(the same loader error is reported by each of the 8 MPI ranks)
--------------------------------------------------------------------------
prterun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prterun-c1-2922@1,0] Exit code:    127
--------------------------------------------------------------------------
Test case:[/apps/hpcg] log HPCG single node results (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.015 sec
FailedNone
(in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Benchmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P8.gnu14.openmpi5.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P8.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.

Test Suite: build

Results

Duration5.905 sec
Tests1
Failures0

Tests

build

Test case:[/apps/hpcg] build HPCG executable (gnu14/openmpi5)
Outcome:Passed
Duration:5.905 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration330.639 sec
Tests3
Failures2

Tests

rm_execution_multi_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:330.616 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.19567
Batch job 9 submitted

Job 9 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./bin/xhpcg.gnu14.mpich 32 32 32 10 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 9 ON c1 CANCELLED AT 2024-10-21T18:45:34 DUE TO TIME LIMIT ***
Test case:[/apps/hpcg] log HPCG multi node results (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.014 sec
FailedNone
(in test file rm_execution_multi_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Benchmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P16.gnu14.mpich.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P16.gnu14.mpich.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration308.209 sec
Tests3
Failures2

Tests

rm_execution_single_host

Test case:[/apps/hpcg] check if resource manager is defined
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:308.186 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT ${EXE} "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17027
Batch job 8 submitted

Job 8 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm ./bin/xhpcg.gnu14.mpich 32 32 32 10 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 8 ON c1 CANCELLED AT 2024-10-21T18:40:02 DUE TO TIME LIMIT ***
Test case:[/apps/hpcg] log HPCG single node results (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.015 sec
FailedNone
(in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Benchmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P8.gnu14.mpich.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P8.gnu14.mpich.yaml'
Try 'mv --help' for more information.

Test Suite: build

Results

Duration7.527 sec
Tests1
Failures0

Tests

build

Test case:[/apps/hpcg] build HPCG executable (gnu14/mpich)
Outcome:Passed
Duration:7.527 sec
FailedNone
None

Test Suite: version_match

Results

Duration0.248 sec
Tests3
Failures0

Tests

version_match

Test case:[Compilers] compiler module loaded (gnu14)
Outcome:Passed
Duration:0.074 sec
FailedNone
None
Test case:[Compilers] compiler module version available (gnu14)
Outcome:Passed
Duration:0.078 sec
FailedNone
None
Test case:[Compilers] C, C++, and Fortran versions match module (gnu14)
Outcome:Passed
Duration:0.096 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.006 sec
Tests6
Failures0

Tests

rm_execution

Test case:[Compilers] C binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.152 sec
FailedNone
None
Test case:[Compilers] C++ binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.166 sec
FailedNone
None
Test case:[Compilers] Fortran binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.185 sec
FailedNone
None
Test case:[Compilers] C openmp binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.18 sec
FailedNone
None
Test case:[Compilers] C++ openmp binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.172 sec
FailedNone
None
Test case:[Compilers] Fortran openmp binary runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.151 sec
FailedNone
None

Test Suite: debugger

Results

Duration0.017 sec
Tests2
Failures0

Tests

debugger

Test case:[Compilers] debugger man page (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[Compilers] debugger availability (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: man_pages

Results

Duration0.068 sec
Tests3
Failures0

Tests

man_pages

Test case:[Compilers] C compiler man/help page (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[Compilers] C++ compiler man/help page (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[Compilers] Fortran compiler man/help page (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.394 sec
Tests2
Failures0

Tests

rm_execution

Test case:[dev-tools/hwloc] lstopo runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.207 sec
FailedNone
None
Test case:[dev-tools/hwloc] hwloc_hello runs under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.187 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.143 sec
Tests7
Failures0

Tests

test_module

Test case:[HWLOC] Verify HWLOC module is loaded and matches rpm version (gnu14/)
Outcome:Passed
Duration:0.093 sec
FailedNone
None
Test case:[HWLOC] Verify module HWLOC_DIR is defined and exists (gnu14/)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[HWLOC] Verify module HWLOC_LIB is defined and exists (gnu14/)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[HWLOC] Verify dynamic library available in HWLOC_LIB (gnu14/)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[HWLOC] Verify static library is not present in HWLOC_LIB (gnu14/)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[HWLOC] Verify module HWLOC_INC is defined and exists (gnu14/)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[HWLOC] Verify header file is present in HWLOC_INC (gnu14/)
Outcome:Passed
Duration:0.008 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.644 sec
Tests2
Failures0

Tests

rm_execution

Test case:[Valgrind] Callgrind execution under resource manager (slurm/gnu14)
Outcome:Passed
Duration:0.49 sec
FailedNone
None
Test case:[Valgrind] Memcheck execution under resource manager (slurm/gnu14)
Outcome:Passed
Duration:1.154 sec
FailedNone
None

Test Suite: test_module

Results

Duration5.067 sec
Tests8
Failures0

Tests

test_module

Test case:[Valgrind] Verify valgrind module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.093 sec
FailedNone
None
Test case:[Valgrind] Verify module VALGRIND_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[Valgrind] Verify availability of valgrind binary (gnu14)
Outcome:Passed
Duration:0.016 sec
FailedNone
None
Test case:[Valgrind] Verify availability of man page (gnu14)
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[Valgrind] Verify module VALGRIND_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[Valgrind] Verify header file is present in VALGRIND_INC (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[Valgrind] Callgrind compile/test (gnu14)
Outcome:Passed
Duration:3.332 sec
FailedNone
None
Test case:[Valgrind] Memcheck compile/test (gnu14)
Outcome:Passed
Duration:1.573 sec
FailedNone
None
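The compile/test cases build a small example and run it under each tool; the equivalent manual invocations are simply (./a.out stands in for whatever test binary is built):

  valgrind --tool=memcheck ./a.out      # heap and leak checking
  valgrind --tool=callgrind ./a.out     # call-graph profiling; summarize with callgrind_annotate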

Test Suite: test_module

Results

Duration0.027 sec
Tests3
Failures0

Tests

test_module

Test case:[mpi4py] Verify module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[mpi4py] Verify PYTHONPATH is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: hipy

Results

Duration137.685 sec
Tests1
Failures1

Tests

hipy

Test case:[dev-tools/py3-mpi4py] python hello world (slurm/gnu14/mpich)
Outcome:Failed
Duration:137.685 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file hipy, line 35)
  `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9454
Batch job 24 submitted

Job 24 encountered timeout...
Reason=TimeLimit

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpiexec.hydra -bootstrap slurm python3 helloworld.py 8 (family=mpich)
/bin/srun: unrecognized option '--external-launcher'
Try "srun --help" for more information
slurmstepd-c1: error: *** JOB 24 ON c1 CANCELLED AT 2024-10-21T19:00:03 DUE TO TIME LIMIT ***

Test Suite: test_module

Results

Duration0.027 sec
Tests3
Failures0

Tests

test_module

Test case:[mpi4py] Verify module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[mpi4py] Verify PYTHONPATH is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: hipy

Results

Duration1.1 sec
Tests1
Failures1

Tests

hipy

Test case:[dev-tools/py3-mpi4py] python hello world (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.1 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 414,
 in test file hipy, line 35)
  `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.6893
Batch job 25 submitted

Job 25 failed...
Reason=NonZeroExitCode

[prun] Master compute host = c1
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun -- python3 helloworld.py 8 (family=openmpi5)
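Unlike the earlier openmpi5 failures, no stderr is captured here beyond the non-zero exit code. One way to surface the actual error is to rerun the same launch interactively under an allocation (a sketch; module names follow the toolchain in the test label, and the launch line mirrors the one above):

  salloc -N 1 -n 8
  module load gnu14 openmpi5 py3-mpi4py
  mpirun -- python3 helloworld.py 8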

Test Suite: test_cmake

Results

Duration1.492 sec
Tests4
Failures0

Tests

test_cmake

Test case:[/dev-tools/cmake] running cmake --system-information
Outcome:Passed
Duration:0.607 sec
FailedNone
None
Test case:[/dev-tools/cmake] run cmake
Outcome:Passed
Duration:0.303 sec
FailedNone
None
Test case:[/dev-tools/cmake] run make on generated Makefile
Outcome:Passed
Duration:0.573 sec
FailedNone
None
Test case:[/dev-tools/cmake] run compiled binary
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.036 sec
Tests4
Failures0

Tests

test_module

Test case:[Numpy] Verify NUMPY modules can be loaded and match rpm version (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[Numpy] Verify module NUMPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[Numpy] Verify module NUMPY_BIN is defined and exists
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[Numpy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None

Test Suite: MM

Results

Duration3.744 sec
Tests1
Failures0

Tests

MM

Test case:[dev-tools/py3-numpy] Numpy Matrix Multiply
Outcome:Passed
Duration:3.744 sec
FailedNone
None

Test Suite: springs

Results

Duration0.405 sec
Tests1
Failures0

Tests

springs

Test case:[dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/mpich)
Outcome:Passed
Duration:0.405 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.028 sec
Tests3
Failures0

Tests

test_module

Test case:[scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None
Test case:[scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.01 sec
FailedNone
None
Test case:[scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: springs

Results

Duration0.391 sec
Tests1
Failures0

Tests

springs

Test case:[dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/openmpi5)
Outcome:Passed
Duration:0.391 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.028 sec
Tests3
Failures0

Tests

test_module

Test case:[scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.008 sec
FailedNone
None
Test case:[scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome:Passed
Duration:0.009 sec
FailedNone
None

Test Suite: test_autotools

Results

Duration4.154 sec
Tests4
Failures0

Tests

test_autotools

Test case:[/dev-tools/autotools] running autoreconf
Outcome:Passed
Duration:2.741 sec
FailedNone
None
Test case:[/dev-tools/autotools] run generated configure
Outcome:Passed
Duration:1.312 sec
FailedNone
None
Test case:[/dev-tools/autotools] run make on generated Makefile
Outcome:Passed
Duration:0.091 sec
FailedNone
None
Test case:[/dev-tools/autotools] run compiled binary
Outcome:Passed
Duration:0.01 sec
FailedNone
None

Test Suite: EasyBuild

Results

Duration21.657 sec
Tests3
Failures0

Tests

EasyBuild

Test case:[EasyBuild] check for RPM
Outcome:Passed
Duration:0.056 sec
FailedNone
None
Test case:[EasyBuild] test executable
Outcome:Passed
Duration:1.385 sec
FailedNone
None
Test case:[EasyBuild] quick test install of bzip2
Outcome:Passed
Duration:20.216 sec
FailedNone
None
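The bzip2 case exercises a full EasyBuild install. Roughly the equivalent interactive sequence (module and easyconfig names are illustrative and may differ on this system):

  module load EasyBuild
  eb --version
  eb bzip2-1.0.8.eb --robot   # --robot resolves and builds any missing dependencies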

Test Suite: sinfo

Results

Duration0.04 sec
Tests1
Failures0

Tests

sinfo

Test case:[slurm] Verify SLURM RPM version matches sinfo binary
Outcome:Passed
Duration:0.04 sec
FailedNone
None
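The check amounts to comparing the running binary against the packaged version, roughly:

  sinfo --version                          # version reported by the installed binary
  rpm -q --qf '%{VERSION}\n' slurm-ohpc    # package name assumed; query the Slurm RPM actually installed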

Test Suite: munge

Results

Duration1.063 sec
Tests4
Failures0

Tests

munge

Test case:[munge] check for OS provided RPM
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[munge] Generate a credential
Outcome:Passed
Duration:0.014 sec
FailedNone
None
Test case:[munge] Decode credential locally
Outcome:Passed
Duration:0.011 sec
FailedNone
None
Test case:[munge] Run benchmark
Outcome:Passed
Duration:1.015 sec
FailedNone
None
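The credential tests correspond to the standard MUNGE round trip, and the benchmark is typically run with remunge; for reference:

  munge -n | unmunge          # generate an empty-payload credential and decode it locally
  munge -n | ssh c1 unmunge   # decoding on a compute node confirms a shared key (illustrative)
  remunge                     # simple credential throughput benchmark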

Test Suite: sacct

Results

Duration0.0 sec
Tests1
Failures0

Tests

sacct

Test case:[slurm] check for working sacct
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Temporarily skip sacct check

Test Suite: mem_limits

Results

Duration0.405 sec
Tests2
Failures0

Tests

mem_limits

Test case:[memlock] check increased soft limit
Outcome:Passed
Duration:0.213 sec
FailedNone
None
Test case:[memlock] check increased hard limit
Outcome:Passed
Duration:0.192 sec
FailedNone
None

Test Suite: ompi_info

Results

Duration0.438 sec
Tests1
Failures0

Tests

ompi_info

Test case:[openmpi] check for no output to stderr with ompi_info
Outcome:Passed
Duration:0.438 sec
FailedNone
None

Test Suite: pdsh

Results

Duration0.438 sec
Tests4
Failures0

Tests

pdsh

Test case:[pdsh] check for RPM
Outcome:Passed
Duration:0.025 sec
FailedNone
None
Test case:[pdsh] run a shell command on c[1-4]
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[pdsh] check for pdsh-mod-slurm RPM
Outcome:Passed
Duration:0.025 sec
FailedNone
None
Test case:[pdsh] run a shell command on -P normal
Outcome:Passed
Duration:0.196 sec
FailedNone
None
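The two execution tests map onto pdsh's explicit host-range and Slurm-partition targeting (the latter supplied by pdsh-mod-slurm); for example:

  pdsh -w c[1-4] uptime     # run a command on an explicit node range
  pdsh -P normal hostname   # run on every node in the Slurm partition "normal"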

Test Suite: magpie

Results

Duration0.706 sec
Tests3
Failures0

Tests

magpie

Test case:[magpie] check for RPM
Outcome:Passed
Duration:0.025 sec
FailedNone
None
Test case:[magpie] Verify MAGPIE module is loaded and matches rpm version
Outcome:Passed
Duration:0.393 sec
FailedNone
None
Test case:[magpie] Verify module MAGPIE_DIR is defined and exists
Outcome:Passed
Duration:0.288 sec
FailedNone
None

Test Suite: computes

Results

Duration1.114 sec
Tests4
Failures0

Tests

computes

Test case:[BOS] OS distribution matches (2 active computes)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disable BOS_RELEASE check
Test case:[BOS] consistent kernel (2 active computes)
Outcome:Passed
Duration:0.556 sec
FailedNone
None
Test case:[BOS] increased locked memory limits
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping memlock settings for ARCH=aarch64
Test case:[BOS] syslog forwarding
Outcome:Passed
Duration:0.558 sec
FailedNone
None

Test Suite: os_distribution

Results

Duration0.008 sec
Tests1
Failures0

Tests

os_distribution

Test case:[BOS] OS distribution matches (local)
Outcome:Passed
Duration:0.008 sec
FailedNone
None

Test Suite: slurm-plugins

Results

Duration1.401 sec
Tests4
Failures0

Tests

slurm-plugins

Test case:[slurm] check for jobcomp_elasticsearch plugin
Outcome:Passed
Duration:0.014 sec
FailedNone
None
Test case:[slurm] check for job_submit_lua plugin
Outcome:Passed
Duration:0.014 sec
FailedNone
None
Test case:[slurm] check for --x11 option
Outcome:Passed
Duration:0.017 sec
FailedNone
None
Test case:[slurm] check for sview rpm availability
Outcome:Passed
Duration:1.356 sec
FailedNone
None

Test Suite: clustershell

Results

Duration0.552 sec
Tests2
Failures0

Tests

clustershell

Test case:[clush] check for OS-provided RPM
Outcome:Passed
Duration:0.048 sec
FailedNone
None
Test case:[clush] clush -Sg compute
Outcome:Passed
Duration:0.504 sec
FailedNone
None
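The clush test runs a command across the "compute" node group and, with -S, fails if any node returns non-zero. An equivalent interactive invocation:

  clush -Sg compute -b uname -r   # -b folds identical output lines; -S propagates the worst exit status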

Test Suite: genders

Results

Duration0.037 sec
Tests2
Failures0

Tests

genders

Test case:[genders] check for RPM
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[genders] check node attributes
Outcome:Passed
Duration:0.013 sec
FailedNone
None
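The attribute check can be reproduced with nodeattr from the genders package (the attribute name is assumed to mirror the compute group used elsewhere in this report):

  nodeattr -n compute   # list nodes carrying the "compute" attribute
  nodeattr -l c1        # list all attributes defined for node c1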

Test Suite: conman

Results

Duration0.039 sec
Tests2
Failures0

Tests

conman

Test case:[conman] check for RPM
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[conman] query conmand conman
Outcome:Passed
Duration:0.015 sec
FailedNone
None

Test Suite: nhc

Results

Duration5.263 sec
Tests3
Failures0

Tests

nhc

Test case:[nhc] check for RPM
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[nhc] generate config file
Outcome:Passed
Duration:2.583 sec
FailedNone
None
Test case:[nhc] service failure detection and restart
Outcome:Passed
Duration:2.656 sec
FailedNone
None
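The config-generation case corresponds to NHC's bundled generator, and the checks can also be run by hand; a sketch (output path and options depend on how NHC is packaged here):

  nhc-genconf   # derive a baseline configuration from the current node state
  nhc           # execute the configured health checks once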

Test Suite: lmod

Results

Duration0.356 sec
Tests1
Failures0

Tests

lmod

Test case:[lmod] test that the setup function passed
Outcome:Passed
Duration:0.356 sec
FailedNone
None

Test Suite: spack

Results

Duration100.083 sec
Tests4
Failures0

Tests

spack

Test case:[spack] check for RPM
Outcome:Passed
Duration:0.448 sec
FailedNone
None
Test case:[spack] add compiler
Outcome:Passed
Duration:1.989 sec
FailedNone
None
Test case:[spack] test build
Outcome:Passed
Duration:96.205 sec
FailedNone
None
Test case:[spack] test module refresh
Outcome:Passed
Duration:1.441 sec
FailedNone
None
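The spack suite mirrors a typical bootstrap sequence: register the toolchain compiler, build a small package, and regenerate modulefiles. Roughly (the package chosen for the test build is illustrative):

  spack compiler find            # register the gcc from the loaded gnu14 module
  spack install zlib             # small throwaway build
  spack module lmod refresh -y   # regenerate Lmod modulefiles for installed packages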

Test Suite: ntp

Results

Duration0.381 sec
Tests3
Failures0

Tests

ntp

Test case:[ntp] check for chronyc binary
Outcome:Passed
Duration:0.015 sec
FailedNone
None
Test case:[ntp] verify local time in sync on SMS
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[ntp] verify local time in sync on compute
Outcome:Passed
Duration:0.163 sec
FailedNone
None
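The sync checks amount to querying chrony locally and on a compute node, e.g.:

  chronyc tracking              # offset and stratum on the SMS
  pdsh -w c1 chronyc tracking   # the same query on a compute node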

Test Suite: warewulf-ipmi

Results

Duration0.231 sec
Tests1
Failures0

Tests

warewulf-ipmi

Test case:[warewulf-ipmi] ipmitool lanplus protocol
Outcome:Passed
Duration:0.231 sec
FailedNone
None

Test Suite: ipmitool

Results

Duration0.03 sec
Tests6
Failures0

Tests

ipmitool

Test case:[OOB] ipmitool exists
Outcome:Passed
Duration:0.015 sec
FailedNone
None
Test case:[OOB] istat exists
Outcome:Passed
Duration:0.015 sec
FailedNone
None
Test case:[OOB] ipmitool local bmc ping
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping local bmc ping for ARCH=aarch64
Test case:[OOB] ipmitool power status
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping local power status for ARCH=aarch64
Test case:[OOB] ipmitool read CPU1 sensor data
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping read CPU1 data for ARCH=aarch64
Test case:[OOB] ipmitool read sel log
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Skipping sel log read as entry 1 may not always be available
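For reference, the skipped checks correspond to standard ipmitool queries of this form (BMC address and credentials are placeholders):

  ipmitool -I lanplus -H <bmc-host> -U <user> -P <password> power status
  ipmitool sdr list | grep -i cpu1   # sensor readings; the CPU1 entry name is platform-specific
  ipmitool sel list                  # system event log entries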

Test Suite: conman

Results

Duration0.098 sec
Tests3
Failures0

Tests

conman

Test case:[ConMan] Verify conman binary available
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[ConMan] Verify rpm version matches binary
Outcome:Passed
Duration:0.028 sec
FailedNone
None
Test case:[ConMan] Verify man page availability
Outcome:Passed
Duration:0.05 sec
FailedNone
None