Test Report: results

Test Suite: InstallTests.computes_installed-20250321144005

Results

Duration: 1.062 sec
Tests: 5
Failures: 0

Tests

InstallTests.computes_installed

Test case: test_01_numcomputes_greater_than_zero
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test case: test_02_koomie_cf_available
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test case: test_03_nonzero_results_from_uptime
Outcome: Passed
Duration: 1.062 sec
Failed: None

Test case: test_04_correct_number_of_hosts_booted
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test case: test_05_verify_boot_times_are_reasonable
Outcome: Passed
Duration: 0.0 sec
Failed: None

Test Suite: sms_installed.bats

Results

Duration: 0.052 sec
Tests: 2
Failures: 0

Tests

sms_installed.bats

Test case: Verify hostname matches expectations
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: Base OS check
Outcome: Passed
Duration: 0.029 sec
Failed: None

Test Suite: clustershell

Results

Duration: 1.103 sec
Tests: 2
Failures: 0

Tests

clustershell

Test case: [clush] check for OS-provided RPM
Outcome: Passed
Duration: 0.052 sec
Failed: None

Test case: [clush] clush -Sg compute
Outcome: Passed
Duration: 1.051 sec
Failed: None

Test Suite: conman

Results

Duration: 0.081 sec
Tests: 2
Failures: 0

Tests

conman

Test case: [conman] check for RPM
Outcome: Passed
Duration: 0.05 sec
Failed: None

Test case: [conman] query conmand conman
Outcome: Passed
Duration: 0.031 sec
Failed: None

Test Suite: genders

Results

Duration: 0.081 sec
Tests: 2
Failures: 0

Tests

genders

Test case: [genders] check for RPM
Outcome: Passed
Duration: 0.05 sec
Failed: None

Test case: [genders] check node attributes
Outcome: Passed
Duration: 0.031 sec
Failed: None

Test Suite: nhc

Results

Duration: 3.553 sec
Tests: 3
Failures: 0

Tests

nhc

Test case: [nhc] check for RPM
Outcome: Passed
Duration: 0.049 sec
Failed: None

Test case: [nhc] generate config file
Outcome: Passed
Duration: 1.161 sec
Failed: None

Test case: [nhc] service failure detection and restart
Outcome: Passed
Duration: 2.343 sec
Failed: None

Test Suite: slurm-plugins

Results

Duration: 1.397 sec
Tests: 4
Failures: 0

Tests

slurm-plugins

Test case: [slurm] check for jobcomp_elasticsearch plugin
Outcome: Passed
Duration: 0.03 sec
Failed: None

Test case: [slurm] check for job_submit_lua plugin
Outcome: Passed
Duration: 0.031 sec
Failed: None

Test case: [slurm] check for --x11 option
Outcome: Passed
Duration: 0.04 sec
Failed: None

Test case: [slurm] check for sview rpm availability
Outcome: Passed
Duration: 1.296 sec
Failed: None

Test Suite: lmod

Results

Duration: 0.431 sec
Tests: 1
Failures: 0

Tests

lmod

Test case: [lmod] test that the setup function passed
Outcome: Passed
Duration: 0.431 sec
Failed: None

Test Suite: spack

Results

Duration: 53.237 sec
Tests: 4
Failures: 0

Tests

spack

Test case: [spack] check for RPM
Outcome: Passed
Duration: 0.55 sec
Failed: None

Test case: [spack] add compiler
Outcome: Passed
Duration: 1.825 sec
Failed: None

Test case: [spack] test build
Outcome: Passed
Duration: 49.548 sec
Failed: None

Test case: [spack] test module refresh
Outcome: Passed
Duration: 1.314 sec
Failed: None

Test Suite: build

Results

Duration: 6.504 sec
Tests: 1
Failures: 0

Tests

build

Test case: [/apps/hpcg] build HPCG executable (gnu14/mpich)
Outcome: Passed
Duration: 6.504 sec
Failed: None

Test Suite: rm_execution_multi_host

Results

Duration: 2.33 sec
Tests: 3
Failures: 2

Tests

rm_execution_multi_host

Test case: [/apps/hpcg] check if resource manager is defined
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test case: [/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 2.278 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.8467
Batch job 14 submitted

Job 14 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.14.out: No such file or directory
Test case: [/apps/hpcg] log HPCG multi node results (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.033 sec
Failed: None
(in test file rm_execution_multi_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P16.gnu14.mpich.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P16.gnu14.mpich.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration: 1.267 sec
Tests: 3
Failures: 2

Tests

rm_execution_single_host

Test case: [/apps/hpcg] check if resource manager is defined
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test case: [/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 1.216 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT ${EXE} "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15976
Batch job 13 submitted

Job 13 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.13.out: No such file or directory
Test case: [/apps/hpcg] log HPCG single node results (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.032 sec
Failed: None
(in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P8.gnu14.mpich.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P8.gnu14.mpich.yaml'
Try 'mv --help' for more information.

Test Suite: build

Results

Duration: 4.469 sec
Tests: 1
Failures: 0

Tests

build

Test case: [/apps/hpcg] build HPCG executable (gnu14/openmpi5)
Outcome: Passed
Duration: 4.469 sec
Failed: None

Test Suite: rm_execution_multi_host

Results

Duration: 2.343 sec
Tests: 3
Failures: 2

Tests

rm_execution_multi_host

Test case: [/apps/hpcg] check if resource manager is defined
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [/apps/hpcg] run HPCG on multi nodes under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 2.288 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.10400
Batch job 16 submitted

Job 16 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.16.out: No such file or directory
Test case: [/apps/hpcg] log HPCG multi node results (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.035 sec
Failed: None
(in test file rm_execution_multi_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P16.gnu14.openmpi5.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P16.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration: 1.276 sec
Tests: 3
Failures: 2

Tests

rm_execution_single_host

Test case: [/apps/hpcg] check if resource manager is defined
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [/apps/hpcg] run HPCG on single node under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 1.221 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT ${EXE} "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15017
Batch job 15 submitted

Job 15 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.15.out: No such file or directory
Test case: [/apps/hpcg] log HPCG single node results (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.035 sec
Failed: None
(in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml' failed
Finding latest HPCG-Bencmark-*.yaml in /home/ohpc-test/tests/apps/hpcg
ls: cannot access 'HPCG-Benchmark-*.yaml': No such file or directory
Moving  to HPCG.32x32x32.P8.gnu14.openmpi5.yaml
mv: missing destination file operand after 'HPCG.32x32x32.P8.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.

Test Suite: build

Results

Duration: 10.689 sec
Tests: 1
Failures: 0

Tests

build

Test case: [Apps/miniFE] build MiniFE executable (gnu14/mpich)
Outcome: Passed
Duration: 10.689 sec
Failed: None

Test Suite: rm_execution_multi_host

Results

Duration: 1.241 sec
Tests: 2
Failures: 2

Tests

rm_execution_multi_host

Test case: [Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 1.212 sec
Failed: None
(from function `run_mpi_binary' in file ./common-test/functions, line 403,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.335
Batch job 18 submitted

Job 18 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.18.out: No such file or directory
Test case: [Apps/miniFE] log miniFE multi node results (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.029 sec
Failed: None
(in test file rm_execution_multi_host, line 54)
  `mv $run_yaml $wrk_yaml || exit 1' failed
ls: cannot access 'miniFE.256x256x256.P16.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.256x256x256.P16.gnu14.mpich.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration: 1.252 sec
Tests: 2
Failures: 2

Tests

rm_execution_single_host

Test case: [Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 1.215 sec
Failed: None
(from function `run_mpi_binary' in file ./common-test/functions, line 403,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.10854
Batch job 17 submitted

Job 17 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.17.out: No such file or directory
Test case: [Apps/miniFE] log miniFE single node results (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.037 sec
Failed: None
(from function `flunk' in file ./common-test/test_helper_functions.bash, line 14,
 in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml || flunk "Unable to move ${run_yaml} file to ${work_yaml}"' failed
ls: cannot access 'miniFE.100x100x100.P8.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.100x100x100.P8.gnu14.mpich.yaml'
Try 'mv --help' for more information.
Unable to move  file to

Test Suite: build

Results

Duration: 7.21 sec
Tests: 1
Failures: 0

Tests

build

Test case: [Apps/miniFE] build MiniFE executable (gnu14/openmpi5)
Outcome: Passed
Duration: 7.21 sec
Failed: None

Test Suite: rm_execution_multi_host

Results

Duration: 1.243 sec
Tests: 2
Failures: 2

Tests

rm_execution_multi_host

Test case: [Apps/miniFE] run miniFE on multi nodes under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 1.213 sec
Failed: None
(from function `run_mpi_binary' in file ./common-test/functions, line 403,
 in test file rm_execution_multi_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.25585
Batch job 20 submitted

Job 20 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.20.out: No such file or directory
Test case: [Apps/miniFE] log miniFE multi node results (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.03 sec
Failed: None
(in test file rm_execution_multi_host, line 54)
  `mv $run_yaml $wrk_yaml || exit 1' failed
ls: cannot access 'miniFE.256x256x256.P16.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.256x256x256.P16.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.

Test Suite: rm_execution_single_host

Results

Duration: 1.253 sec
Tests: 2
Failures: 2

Tests

rm_execution_single_host

Test case: [Apps/miniFE] run miniFE on single node under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 1.216 sec
Failed: None
(from function `run_mpi_binary' in file ./common-test/functions, line 403,
 in test file rm_execution_single_host, line 46)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.12459
Batch job 19 submitted

Job 19 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.19.out: No such file or directory
Test case: [Apps/miniFE] log miniFE single node results (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.037 sec
Failed: None
(from function `flunk' in file ./common-test/test_helper_functions.bash, line 14,
 in test file rm_execution_single_host, line 55)
  `mv $run_yaml $wrk_yaml || flunk "Unable to move ${run_yaml} file to ${work_yaml}"' failed
ls: cannot access 'miniFE.100x100x100.P8.*.yaml': No such file or directory
mv: missing destination file operand after 'miniFE.100x100x100.P8.gnu14.openmpi5.yaml'
Try 'mv --help' for more information.
Unable to move  file to

Test Suite: os_distribution

Results

Duration: 0.019 sec
Tests: 1
Failures: 0

Tests

os_distribution

Test case: [BOS] OS distribution matches (local)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test Suite: computes

Results

Duration: 4.374 sec
Tests: 4
Failures: 0

Tests

computes

Test case: [BOS] OS distribution matches (2 active computes)
Outcome: Skipped
Duration: 0.0 sec
Failed: None
Skipped: disable BOS_RELEASE check

Test case: [BOS] consistent kernel (2 active computes)
Outcome: Passed
Duration: 1.102 sec
Failed: None

Test case: [BOS] increased locked memory limits
Outcome: Passed
Duration: 2.179 sec
Failed: None

Test case: [BOS] syslog forwarding
Outcome: Passed
Duration: 1.093 sec
Failed: None

Test Suite: debugger

Results

Duration: 0.035 sec
Tests: 2
Failures: 0

Tests

debugger

Test case: [Compilers] debugger man page (gnu14)
Outcome: Passed
Duration: 0.016 sec
Failed: None

Test case: [Compilers] debugger availability (gnu14)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test Suite: man_pages

Results

Duration: 0.152 sec
Tests: 3
Failures: 0

Tests

man_pages

Test case: [Compilers] C compiler man/help page (gnu14)
Outcome: Passed
Duration: 0.052 sec
Failed: None

Test case: [Compilers] C++ compiler man/help page (gnu14)
Outcome: Passed
Duration: 0.052 sec
Failed: None

Test case: [Compilers] Fortran compiler man/help page (gnu14)
Outcome: Passed
Duration: 0.048 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 0.79 sec
Tests: 6
Failures: 6

Tests

rm_execution

Test case: [Compilers] C binary runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.132 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 19)
  `run_serial_binary ./C_test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./C_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Compilers] C++ binary runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.141 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 28)
  `run_serial_binary ./CXX_test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./CXX_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Compilers] Fortran binary runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.139 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 37)
  `run_serial_binary ./F90_test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./F90_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Compilers] C openmp binary runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.134 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 46)
  `run_serial_binary ./C_openmp_test 8' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./C_openmp_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Compilers] C++ openmp binary runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.128 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 55)
  `run_serial_binary ./CXX_openmp_test 8' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./CXX_openmp_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Compilers] Fortran openmp binary runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.116 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 64)
  `run_serial_binary ./F90_openmp_test 8' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/compilers/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./F90_openmp_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: version_match

Results

Duration: 0.434 sec
Tests: 3
Failures: 0

Tests

version_match

Test case: [Compilers] compiler module loaded (gnu14)
Outcome: Passed
Duration: 0.125 sec
Failed: None

Test case: [Compilers] compiler module version available (gnu14)
Outcome: Passed
Duration: 0.129 sec
Failed: None

Test case: [Compilers] C, C++, and Fortran versions match module (gnu14)
Outcome: Passed
Duration: 0.18 sec
Failed: None

Test Suite: test_autotools

Results

Duration: 5.087 sec
Tests: 4
Failures: 0

Tests

test_autotools

Test case: [/dev-tools/autotools] running autoreconf
Outcome: Passed
Duration: 3.221 sec
Failed: None

Test case: [/dev-tools/autotools] run generated configure
Outcome: Passed
Duration: 1.684 sec
Failed: None

Test case: [/dev-tools/autotools] run make on generated Makefile
Outcome: Passed
Duration: 0.16 sec
Failed: None

Test case: [/dev-tools/autotools] run compiled binary
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test Suite: test_cmake

Results

Duration: 2.01 sec
Tests: 4
Failures: 0

Tests

test_cmake

Test case: [/dev-tools/cmake] running cmake --system-information
Outcome: Passed
Duration: 0.754 sec
Failed: None

Test case: [/dev-tools/cmake] run cmake
Outcome: Passed
Duration: 0.4 sec
Failed: None

Test case: [/dev-tools/cmake] run make on generated Makefile
Outcome: Passed
Duration: 0.835 sec
Failed: None

Test case: [/dev-tools/cmake] run compiled binary
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test Suite: cuda

Results

Duration: 3.251 sec
Tests: 3
Failures: 1

Tests

cuda

Test case: [dev-tools/cuda] run nvidia-smi
Outcome: Passed
Duration: 0.365 sec
Failed: None

Test case: [dev-tools/cuda] check for nvcc
Outcome: Passed
Duration: 0.496 sec
Failed: None

Test case: [dev-tools/cuda] build cuda example and run it
Outcome: Failed
Duration: 2.39 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file cuda, line 32)
  `run_serial_binary ./add_cuda' failed with status 2
nvcc warning : Support for offline compilation for architectures prior to '<compute/sm/lto>_75' will be removed in a future release (Use -Wno-deprecated-gpu-targets to suppress warning).
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/cuda': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/cuda': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./add_cuda: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: EasyBuild

Results

Duration: 11.28 sec
Tests: 3
Failures: 0

Tests

EasyBuild

Test case: [EasyBuild] check for RPM
Outcome: Passed
Duration: 0.092 sec
Failed: None

Test case: [EasyBuild] test executable
Outcome: Passed
Duration: 1.272 sec
Failed: None

Test case: [EasyBuild] quick test install of bzip2
Outcome: Passed
Duration: 9.916 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 0.31 sec
Tests: 2
Failures: 1

Tests

rm_execution

Test case: [dev-tools/hwloc] lstopo runs under resource manager (slurm/gnu14)
Outcome: Passed
Duration: 0.174 sec
Failed: None

Test case: [dev-tools/hwloc] hwloc_hello runs under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.136 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 28)
  `run_serial_binary ./hwloc_hello' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/hwloc/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/hwloc/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./hwloc_hello: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: test_module

Results

Duration: 0.287 sec
Tests: 7
Failures: 0

Tests

test_module

Test case: [HWLOC] Verify HWLOC module is loaded and matches rpm version (gnu14/)
Outcome: Passed
Duration: 0.166 sec
Failed: None

Test case: [HWLOC] Verify module HWLOC_DIR is defined and exists (gnu14/)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [HWLOC] Verify module HWLOC_LIB is defined and exists (gnu14/)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [HWLOC] Verify dynamic library available in HWLOC_LIB (gnu14/)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [HWLOC] Verify static library is not present in HWLOC_LIB (gnu14/)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [HWLOC] Verify module HWLOC_INC is defined and exists (gnu14/)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [HWLOC] Verify header file is present in HWLOC_INC (gnu14/)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test Suite: hipy

Results

Duration: 1.211 sec
Tests: 1
Failures: 1

Tests

hipy

Test case: [dev-tools/py3-mpi4py] python hello world (slurm/gnu14/mpich)
Outcome: Failed
Duration: 1.211 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file hipy, line 35)
  `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.2644
Batch job 29 submitted

Job 29 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.29.out: No such file or directory

Test Suite: test_module

Results

Duration: 0.06 sec
Tests: 3
Failures: 0

Tests

test_module

Test case: [mpi4py] Verify module is loaded and matches rpm version (gnu14/mpich)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test case: [mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/mpich)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [mpi4py] Verify PYTHONPATH is defined and exists (gnu14/mpich)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test Suite: hipy

Results

Duration: 1.218 sec
Tests: 1
Failures: 1

Tests

hipy

Test case: [dev-tools/py3-mpi4py] python hello world (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 1.218 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file hipy, line 35)
  `run_mpi_binary "${_python} helloworld.py" $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.23737
Batch job 30 submitted

Job 30 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.30.out: No such file or directory

Test Suite: test_module

Results

Duration: 0.063 sec
Tests: 3
Failures: 0

Tests

test_module

Test case: [mpi4py] Verify module is loaded and matches rpm version (gnu14/openmpi5)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [mpi4py] Verify module MPI4PY_DIR is defined and exists (gnu14/openmpi5)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [mpi4py] Verify PYTHONPATH is defined and exists (gnu14/openmpi5)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test Suite: MM

Results

Duration: 1.325 sec
Tests: 1
Failures: 0

Tests

MM

Test case: [dev-tools/py3-numpy] Numpy Matrix Multiply
Outcome: Passed
Duration: 1.325 sec
Failed: None

Test Suite: test_module

Results

Duration: 0.084 sec
Tests: 4
Failures: 0

Tests

test_module

Test case: [Numpy] Verify NUMPY modules can be loaded and match rpm version (gnu14)
Outcome: Passed
Duration: 0.018 sec
Failed: None

Test case: [Numpy] Verify module NUMPY_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [Numpy] Verify module NUMPY_BIN is defined and exists
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [Numpy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test Suite: springs

Results

Duration: 0.344 sec
Tests: 1
Failures: 0

Tests

springs

Test case: [dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/mpich)
Outcome: Passed
Duration: 0.344 sec
Failed: None

Test Suite: test_module

Results

Duration: 0.061 sec
Tests: 3
Failures: 0

Tests

test_module

Test case: [scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test case: [scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test Suite: springs

Results

Duration: 0.34 sec
Tests: 1
Failures: 0

Tests

springs

Test case: [dev-tools/py3-scipy] Coupled Spring-Mass System (/gnu14/openmpi5)
Outcome: Passed
Duration: 0.34 sec
Failed: None

Test Suite: test_module

Results

Duration: 0.062 sec
Tests: 3
Failures: 0

Tests

test_module

Test case: [scipy] Verify SCIPY module is loaded and matches rpm version (gnu14)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test case: [scipy] Verify module SCIPY_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.024 sec
Failed: None

Test case: [scipy] Verify PYTHONPATH is defined and exists (gnu14)
Outcome: Passed
Duration: 0.019 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 0.455 sec
Tests: 2
Failures: 2

Tests

rm_execution

Test case: [Valgrind] Callgrind execution under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.225 sec
Failed: None
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 28)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/valgrind/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/valgrind/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): valgrind: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case: [Valgrind] Memcheck execution under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.23 sec
Failed: None
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 46)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/valgrind/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/dev-tools/valgrind/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): valgrind: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: test_module

Results

Duration: 1.738 sec
Tests: 8
Failures: 0

Tests

test_module

Test case: [Valgrind] Verify valgrind module is loaded and matches rpm version (gnu14)
Outcome: Passed
Duration: 0.169 sec
Failed: None

Test case: [Valgrind] Verify module VALGRIND_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [Valgrind] Verify availability of valgrind binary (gnu14)
Outcome: Passed
Duration: 0.033 sec
Failed: None

Test case: [Valgrind] Verify availability of man page (gnu14)
Outcome: Passed
Duration: 0.058 sec
Failed: None

Test case: [Valgrind] Verify module VALGRIND_INC is defined and exists (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [Valgrind] Verify header file is present in VALGRIND_INC (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [Valgrind] Callgrind compile/test (gnu14)
Outcome: Passed
Duration: 0.367 sec
Failed: None

Test case: [Valgrind] Memcheck compile/test (gnu14)
Outcome: Passed
Duration: 1.047 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 0.178 sec
Tests: 1
Failures: 1

Tests

rm_execution

Test case: [R] Running Rscript bench.R under resource manager (slurm/gnu14)
Outcome: Failed
Duration: 0.178 sec
Failed: None
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 28)
  `assert_success' failed
bench.R

-- command failed --
status : 2
output :
--

Test Suite: test_module

Results

Duration: 12.889 sec
Tests: 7
Failures: 0

Tests

test_module

Test case: [R] Verify R module is loaded and matches rpm version (gnu14)
Outcome: Passed
Duration: 0.225 sec
Failed: None

Test case: [R] Verify R_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [R] Verify availability of R executable -> R (gnu14)
Outcome: Passed
Duration: 0.032 sec
Failed: None

Test case: [R] Verify availability of R executable -> Rscript (gnu14)
Outcome: Passed
Duration: 0.033 sec
Failed: None

Test case: [R] Verify module R_SHARE is defined and exists (gnu14)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [R] Running bench.R test
Outcome: Passed
Duration: 11.995 sec
Failed: None

Test case: [R] Verify ability to compile C code for R and execute
Outcome: Passed
Duration: 0.559 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 3.514 sec
Tests: 2
Failures: 2

Tests

rm_execution

Test case: [libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 1.223 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.20924
Batch job 33 submitted

Job 33 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.33.out: No such file or directory
Test case: [libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 2.291 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.7038
Batch job 34 submitted

Job 34 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.34.out: No such file or directory

Test Suite: test_module

Results

Duration: 0.342 sec
Tests: 7
Failures: 0

Tests

test_module

Test case: [libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome: Passed
Duration: 0.208 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome: Passed
Duration: 0.025 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 3.532 sec
Tests: 2
Failures: 2

Tests

rm_execution

Test case: [libs/ADIOS2] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 1.23 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.16629
Batch job 35 submitted

Job 35 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.35.out: No such file or directory
Test case: [libs/ADIOS2] MPI F90 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 2.302 sec
Failed: None
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.13297
Batch job 36 submitted

Job 36 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.36.out: No such file or directory

Test Suite: test_module

Results

Duration: 0.343 sec
Tests: 7
Failures: 0

Tests

test_module

Test case: [libs/ADIOS2] Verify ADIOS2 module is loaded and matches rpm version (gnu14)
Outcome: Passed
Duration: 0.207 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_DIR is defined and exists (gnu14)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_BIN is defined and exists
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_LIB is defined and exists (gnu14)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [libs/ADIOS2] Verify (dynamic) library available in ADIOS2_LIB (gnu14)
Outcome: Passed
Duration: 0.026 sec
Failed: None

Test case: [libs/ADIOS2] Verify module ADIOS2_INC is defined and exists (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [libs/ADIOS2] Verify header file is present in ADIOS2_INC (gnu14)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 0.524 sec
Tests: 4
Failures: 4

Tests

rm_execution

Test case: [Boost/Accumulators] min-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.133 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./min: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Accumulators] max-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.131 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./max: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.13 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./skewness: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Accumulators] variance-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.13 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./variance: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration: 0.487 sec
Tests: 4
Failures: 4

Tests

rm_execution

Test case: [Boost/Accumulators] min-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.103 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
srun: error: c1: task 0: Exited with exit code 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./min: No such file or directory
Test case: [Boost/Accumulators] max-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.136 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./max: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Accumulators] skewness-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.121 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./skewness: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Accumulators] variance-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.127 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/accumulators/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./variance: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: test_module

Results

Duration: 3.221 sec
Tests: 8
Failures: 0

Tests

test_module

Test case: [BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich)
Outcome: Passed
Duration: 0.204 sec
Failed: None

Test case: [BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich)
Outcome: Passed
Duration: 0.024 sec
Failed: None

Test case: [BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich)
Outcome: Passed
Duration: 0.02 sec
Failed: None

Test case: [BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [BOOST] Verify header file is present in BOOST_INC (gnu14/mpich)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [BOOST] Test interactive build/exec of f_test.cpp (gnu14/mpich)
Outcome: Passed
Duration: 2.89 sec
Failed: None

Test Suite: test_module

Results

Duration: 3.146 sec
Tests: 8
Failures: 0

Tests

test_module

Test case: [BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome: Passed
Duration: 0.21 sec
Failed: None

Test case: [BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5)
Outcome: Passed
Duration: 0.023 sec
Failed: None

Test case: [BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5)
Outcome: Passed
Duration: 0.021 sec
Failed: None

Test case: [BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [BOOST] Verify header file is present in BOOST_INC (gnu14/openmpi5)
Outcome: Passed
Duration: 0.022 sec
Failed: None

Test case: [BOOST] Test interactive build/exec of f_test.cpp (gnu14/openmpi5)
Outcome: Passed
Duration: 2.804 sec
Failed: None

Test Suite: rm_execution

Results

Duration: 0.503 sec
Tests: 4
Failures: 4

Tests

rm_execution

Test case: [Boost/Multi Array] access-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.127 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./access: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.121 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./iterators: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Multi Array] resize-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.131 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./resize: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.124 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./idxgen1: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration: 0.502 sec
Tests: 4
Failures: 4

Tests

rm_execution

Test case: [Boost/Multi Array] access-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.12 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./access: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Multi Array] iterators-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.129 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./iterators: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Multi Array] resize-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.124 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./resize: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Multi Array] idxgen1-test under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.129 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/multi_array/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./idxgen1: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration: 0.222 sec
Tests: 1
Failures: 1

Tests

rm_execution

Test case: [Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.222 sec
Failed: None
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 20)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/numeric/test': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/numeric/test': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./bench1_ublas: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: rm_execution

Results

Duration: 0.219 sec
Tests: 1
Failures: 1

Tests

rm_execution

Test case: [Boost/ublas] bench1_ublas binary under resource manager (slurm/gnu14/openmpi5)
Outcome: Failed
Duration: 0.219 sec
Failed: None
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 20)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/numeric/test': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/numeric/test': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./bench1_ublas: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: rm_execution

Results

Duration: 1.166 sec
Tests: 11
Failures: 9

Tests

rm_execution

Test case: [Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.132 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./cmdline_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] exception_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.12 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./exception_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] options_description_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.127 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./options_description_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] parsers_test on master host (gnu14/mpich)
Outcome: Passed
Duration: 0.03 sec
Failed: None

Test case: [Boost/Program Options] parsers_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.123 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 57)
  `run_serial_binary ./$test config_test.cfg ' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./parsers_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.125 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 66)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./positional_options_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] required_test on master host (gnu14/mpich)
Outcome: Passed
Duration: 0.029 sec
Failed: None

Test case: [Boost/Program Options] required_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.118 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 84)
  `run_serial_binary ./$test required_test.cfg ' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./required_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] unicode_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.119 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 93)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./unicode_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case: [Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/mpich)
Outcome: Failed
Duration: 0.119 sec
Failed: None
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 102)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./unrecognized_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 111)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./variable_map_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
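
Failure pattern for this suite: every failed case shows the same signature: slurmstepd on compute node c1 cannot chdir into the test directory under /home/ohpc-test, falls back to /tmp, and the subsequent execve() of the relative ./<test> path then fails, so srun exits with status 2. This points at the test tree not being visible on c1 rather than at the Boost binaries themselves. A minimal diagnostic sketch follows, assuming access to the SMS host and using the node name and path exactly as logged above; these commands are standard clush/srun usage and are not part of the test harness.

  # Check whether /home and the Boost test tree are visible on c1.
  clush -w c1 'df -h /home; ls -ld /home/ohpc-test/tests/libs/boost/tests'

  # Reproduce the chdir behaviour directly; srun's --chdir should fail the
  # same way for as long as the directory is missing on the compute node.
  srun -w c1 --chdir=/home/ohpc-test/tests/libs/boost/tests/program_options/test /bin/pwd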

Test Suite: rm_execution

Results

Duration1.211 sec
Tests11
Failures9

Tests

rm_execution

Test case:[Boost/Program Options] cmdline_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./cmdline_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] exception_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./exception_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] options_description_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./options_description_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] parsers_test on master host(gnu14/openmpi5)
Outcome:Passed
Duration:0.03 sec
FailedNone
None
Test case:[Boost/Program Options] parsers_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 57)
  `run_serial_binary ./$test config_test.cfg ' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./parsers_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] positional_options_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.135 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 66)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./positional_options_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] required_test on master host(gnu14/openmpi5)
Outcome:Passed
Duration:0.029 sec
FailedNone
None
Test case:[Boost/Program Options] required_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.137 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 84)
  `run_serial_binary ./$test required_test.cfg ' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./required_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] unicode_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 93)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./unicode_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] unrecognized_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.136 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 102)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./unrecognized_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Program Options] variable_map_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 111)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/program_options/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./variable_map_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration0.534 sec
Tests4
Failures4

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.129 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_piecewise_linear: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.143 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_discrete: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.137 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_random_device: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_random_number_generator: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration0.512 sec
Tests4
Failures4

Tests

rm_execution

Test case:[Boost/Random] test_piecewise_linear under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.134 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_piecewise_linear: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Random] test_discrete under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_discrete: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Random] test_random_device under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.126 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_random_device: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost/Random] test_random_number_generator under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/random/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test_random_number_generator: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration0.119 sec
Tests1
Failures1

Tests

rm_execution

Test case:[Boost] regress under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.119 sec
FailedNone
(from function `run_serial_binary' in file ../../../../../../common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test/regress': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test/regress': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./regress: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration0.369 sec
Tests3
Failures3

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.111 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./bad_expression_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.131 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./named_subexpressions_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test config_test.cfg ' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./recursion_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: rm_execution

Results

Duration0.368 sec
Tests3
Failures3

Tests

rm_execution

Test case:[Boost] bad_expression_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 21)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./bad_expression_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost] named_subexpressions_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 30)
  `run_serial_binary ./$test' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./named_subexpressions_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[Boost] recursion_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 39)
  `run_serial_binary ./$test config_test.cfg ' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/boost/tests/regex/test': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./recursion_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: test_module

Results

Duration0.336 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.209 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.345 sec
Tests7
Failures0

Tests

test_module

Test case:[BOOST] Verify BOOST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.216 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify dynamic library available in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify static library is not present in BOOST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[BOOST] Verify module BOOST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[BOOST] Verify header file is present in BOOST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration16.866 sec
Tests6
Failures6

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.21 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 21)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.18512
Batch job 89 submitted

Job 89 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.89.out: No such file or directory
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.276 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 30)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.32192
Batch job 90 submitted

Job 90 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.90.out: No such file or directory
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:4.415 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 39)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.21831
Batch job 91 submitted

Job 91 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.91.out: No such file or directory
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.346 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 48)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.26091
Batch job 92 submitted

Job 92 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.92.out: No such file or directory
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.213 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 66)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.10557
Batch job 93 submitted

Job 93 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.93.out: No such file or directory
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:4.406 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 75)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.24145
Batch job 94 submitted

Job 94 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.94.out: No such file or directory
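
Failure pattern for this suite: unlike the serial cases above, these MPI cases go through run_mpi_binary, which submits a batch script staged under /tmp; the job is accepted (e.g. "Batch job 89 submitted") but terminates with RaisedSignal:53 (Real-time_signal_19) and never produces its job.<id>.out file, which is why the trailing cat fails. A minimal follow-up sketch, assuming Slurm accounting is enabled and using the job id and script path exactly as logged above (the /tmp script may already have been cleaned up):

  # Query Slurm accounting for one of the failed batch jobs recorded above.
  sacct -j 89 --format=JobID,JobName,State,ExitCode,NodeList

  # Inspect the generated job script reported by run_mpi_binary for job 89.
  cat /tmp/job.ohpc-test.18512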

Test Suite: rm_execution

Results

Duration11.568 sec
Tests6
Failures6

Tests

rm_execution

Test case:[Boost/MPI] all_gather_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.21 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 21)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.20572
Batch job 95 submitted

Job 95 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.95.out: No such file or directory
Test case:[Boost/MPI] all_reduce_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.281 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 30)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.18908
Batch job 96 submitted

Job 96 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.96.out: No such file or directory
Test case:[Boost/MPI] all_to_all_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.286 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 39)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.22285
Batch job 97 submitted

Job 97 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.97.out: No such file or directory
Test case:[Boost/MPI] groups_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.289 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 48)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.7809
Batch job 98 submitted

Job 98 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.98.out: No such file or directory
Test case:[Boost/MPI] ring_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.216 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 66)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.10044
Batch job 99 submitted

Job 99 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.99.out: No such file or directory
Test case:[Boost/MPI] pointer_test under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.286 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 75)
  `run_mpi_binary ./$test atest 2 16' failed
job script = /tmp/job.ohpc-test.24595
Batch job 100 submitted

Job 100 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.100.out: No such file or directory

Test Suite: test_module

Results

Duration0.305 sec
Tests7
Failures0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.314 sec
Tests7
Failures0

Tests

test_module

Test case:[FFTW] Verify FFTW module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.186 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[FFTW] Verify dynamic library available in FFTW_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify static library is not present in FFTW_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[FFTW] Verify module FFTW_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[FFTW] Verify header file is present in FFTW_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.738 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.23 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 26)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./C_test: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.281 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 40)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.987
  Batch job 102 submitted

  Job 102 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.102.out: No such file or directory
  grep: job.102.out: No such file or directory
--
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.227 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 49)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./F_test: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: rm_execution

Results

Duration1.743 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/FFTW] Serial C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.236 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 26)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./C_test: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/FFTW] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.289 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 40)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.20921
  Batch job 105 submitted

  Job 105 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.105.out: No such file or directory
  grep: job.105.out: No such file or directory
--
Test case:[libs/FFTW] Serial Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.218 sec
FailedNone
(from function `assert_success' in file ../../../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 49)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/fftw/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./F_test: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: rm_execution

Results

Duration6.19 sec
Tests50
Failures49

Tests

rm_execution

Test case:[libs/GSL] run test_gsl_histogram (gnu14)
Outcome:Passed
Duration:0.042 sec
FailedNone
None
Test case:[libs/GSL] run block under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.141 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 33)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/block ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/block': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/block': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run bspline under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 43)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/bspline ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/bspline': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/bspline': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run cblas under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 53)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/cblas ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/cblas': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/cblas': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run cdf under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 63)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/cdf ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/cdf': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/cdf': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run cheb under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.13 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 73)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/cheb ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/cheb': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/cheb': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run combination under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 83)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/combination ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/combination': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/combination': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run complex under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.118 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 93)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/complex ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/complex': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/complex': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run const under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.13 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 103)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/const ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/const': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/const': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run deriv under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.119 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 113)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/deriv ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/deriv': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/deriv': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run dht under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.131 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 123)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/dht ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/dht': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/dht': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run diff under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 133)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/diff ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/diff': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/diff': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run eigen under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.135 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 147)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/eigen ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/eigen': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/eigen': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run err under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 157)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/err ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/err': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/err': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run fft under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 167)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/fft ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/fft': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/fft': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run fit under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 177)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/fit ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/fit': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/fit': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run histogram under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.128 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 187)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/histogram ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/histogram': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/histogram': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run ieee-utils under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 197)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/ieee-utils ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ieee-utils': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ieee-utils': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run integration under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.15 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 207)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/integration ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/integration': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/integration': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run interpolation under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 217)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/interpolation ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/interpolation': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/interpolation': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run linalg under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 227)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/linalg ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/linalg': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/linalg': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run matrix under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.114 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 237)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/matrix ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/matrix': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/matrix': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run min under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 252)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/min ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/min': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/min': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run monte under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.13 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 262)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/monte ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/monte': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/monte': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run multifit under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 277)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/multifit ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multifit': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multifit': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run multilarge under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 288)
  `run_serial_binary -t "00:03:00" ./$tx1' failed with status 2
~/tests/libs/gsl/tests/multilarge ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multilarge': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multilarge': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run multimin under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 298)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/multimin ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multimin': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multimin': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run multiroots under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 308)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/multiroots ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multiroots': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multiroots': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run multiset under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 318)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/multiset ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multiset': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/multiset': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run ntuple under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 328)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/ntuple ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ntuple': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ntuple': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run ode-initval under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 338)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/ode-initval ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ode-initval': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ode-initval': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run ode-initval2 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.126 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 348)
  `run_serial_binary -t "00:02:00" ./$tx1' failed with status 2
~/tests/libs/gsl/tests/ode-initval2 ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ode-initval2': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/ode-initval2': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run permutation under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.132 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 358)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/permutation ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/permutation': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/permutation': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run poly under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 368)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/poly ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/poly': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/poly': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run qrng under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 378)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/qrng ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/qrng': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/qrng': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run randist under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 388)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/randist ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/randist': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/randist': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run rng under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 398)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/rng ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/rng': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/rng': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run roots under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.131 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 408)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/roots ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/roots': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/roots': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run rstat under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.12 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 418)
  `run_serial_binary -t "00:02:00" ./$tx1' failed with status 2
~/tests/libs/gsl/tests/rstat ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/rstat': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/rstat': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run siman under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 428)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/siman ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/siman': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/siman': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run sort under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 438)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/sort ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/sort': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/sort': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run spblas under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 448)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/spblas ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/spblas': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/spblas': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run specfunc under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 463)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/specfunc ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/specfunc': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/specfunc': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run splinalg under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.132 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 473)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/splinalg ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/splinalg': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/splinalg': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run spmatrix under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.12 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 483)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/spmatrix ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/spmatrix': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/spmatrix': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run statistics under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 493)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/statistics ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/statistics': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/statistics': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run sum under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 503)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/sum ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/sum': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/sum': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run sys under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 513)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/sys ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/sys': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/sys': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run vector under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.147 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 523)
  `run_serial_binary -t "00:03:00" ./$tx1' failed with status 2
~/tests/libs/gsl/tests/vector ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/vector': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/vector': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/GSL] run wavelet under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.114 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 538)
  `run_serial_binary ./$tx1' failed with status 2
~/tests/libs/gsl/tests/wavelet ~/tests/libs/gsl/tests
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/wavelet': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/gsl/tests/wavelet': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
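
Analysis note: every failure in this suite follows the same pattern: slurmstepd on c1 cannot chdir into the test directory under /home/ohpc-test, falls back to /tmp, and the subsequent execve of the relative binary path (./test) fails, so each srun step exits with status 2. This points at the compute node lacking access to the shared /home filesystem rather than at a GSL defect. A minimal sketch of a check, assuming the node name c1 and the paths reported above (these commands are illustrative and not part of the test suite):

  # Confirm the test tree is visible from the compute node
  srun -w c1 ls -ld /home/ohpc-test/tests/libs/gsl/tests
  # Compare against the view from the node where the suite was launched
  ls -ld /home/ohpc-test/tests/libs/gsl/tests

If the srun invocation reports "No such file or directory" while the local listing succeeds, the /home export or mount on the compute node is the likely root cause.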

Test Suite: test_module

Results

Duration0.294 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/GSL] Verify GSL module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/GSL] Verify dynamic library available in GSL_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/GSL] Verify static library is not present in GSL_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/GSL] Verify module GSL_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/GSL] Verify header file is present in GSL_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration23.874 sec
Tests12
Failures9

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.224 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 23)
  `run_mpi_binary ./ex1 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.19233
Batch job 167 submitted

Job 167 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.167.out: No such file or directory
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.367 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 32)
  `run_mpi_binary ./ex2 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.22714
Batch job 168 submitted

Job 168 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.168.out: No such file or directory
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.297 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 41)
  `run_mpi_binary ./ex6 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.2086
Batch job 169 submitted

Job 169 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.169.out: No such file or directory
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.365 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 50)
  `run_mpi_binary ./ex8 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28799
Batch job 170 submitted

Job 170 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.170.out: No such file or directory
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.298 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 59)
  `run_mpi_binary ./ex12f "" 1 2' failed
job script = /tmp/job.ohpc-test.19652
Batch job 171 submitted

Job 171 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.171.out: No such file or directory
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.358 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 68)
  `run_mpi_binary ./ex3 "-n 33 -solver 0 -v 1 1" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.20577
Batch job 172 submitted

Job 172 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.172.out: No such file or directory
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.367 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 77)
  `run_mpi_binary ./ex4 "-n 33 -solver 10  -K 3 -B 0 -C 1 -U0 2 -F 4" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.19195
Batch job 173 submitted

Job 173 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.173.out: No such file or directory
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.296 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 86)
  `run_mpi_binary ./ex5f "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15325
Batch job 174 submitted

Job 174 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.174.out: No such file or directory
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.302 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 125)
  `run_mpi_binary ./ex11 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15251
Batch job 175 submitted

Job 175 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.175.out: No such file or directory
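
Analysis note: the MPI failures in this suite (and in the MFEM and MUMPS suites that follow) share a different signature: the batch job is submitted, Slurm reports Reason=RaisedSignal:53(Real-time_signal_19), and the expected job.<id>.out file is never produced. On glibc-based systems SIGRTMIN is normally 34, so signal 53 is SIGRTMIN+19, consistent with the label; the missing output file again suggests the job's working directory does not exist on the compute node, matching the chdir failures seen in the serial suites. A sketch of a post-mortem query, assuming job 167 from the log above and standard Slurm accounting (the directory path below is inferred by analogy with the GSL layout and is an assumption):

  # Post-mortem of one failed batch job recorded above
  sacct -j 167 --format=JobID,JobName,State,ExitCode,NodeList
  # Hypothetical path: check that the presumed working directory exists on c1
  srun -w c1 test -d /home/ohpc-test/tests/libs/hypre/tests && echo present || echo missing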

Test Suite: test_module

Results

Duration1.586 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.204 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.229 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_module, line 131)
  `run_mpi_binary ./ex1 "atest" 1 1' failed
job script = /tmp/job.ohpc-test.14127
Batch job 166 submitted

Job 166 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.166.out: No such file or directory

Test Suite: rm_execution

Results

Duration24.867 sec
Tests12
Failures9

Tests

rm_execution

Test case:[libs/HYPRE] 2 PE structured test binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.287 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 23)
  `run_mpi_binary ./ex1 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.21303
Batch job 157 submitted

Job 157 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.157.out: No such file or directory
Test case:[libs/HYPRE] 2 PE PCG with SMG preconditioner binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.355 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 32)
  `run_mpi_binary ./ex2 $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.10021
Batch job 158 submitted

Job 158 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.158.out: No such file or directory
Test case:[libs/HYPRE] 2 PE Semi-Structured PCG binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.289 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 41)
  `run_mpi_binary ./ex6 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.8811
Batch job 159 submitted

Job 159 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.159.out: No such file or directory
Test case:[libs/HYPRE] 2 PE Three-part stencil binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.356 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 50)
  `run_mpi_binary ./ex8 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18962
Batch job 160 submitted

Job 160 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.160.out: No such file or directory
Test case:[libs/HYPRE] 2 PE FORTRAN PCG with PFMG preconditioner binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.287 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 59)
  `run_mpi_binary ./ex12f "" 1 2' failed
job script = /tmp/job.ohpc-test.18761
Batch job 161 submitted

Job 161 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.161.out: No such file or directory
Test case:[libs/HYPRE] 2 PE -Delta u = 1 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.286 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 68)
  `run_mpi_binary ./ex3 "-n 33 -solver 0 -v 1 1" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9269
Batch job 162 submitted

Job 162 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.162.out: No such file or directory
Test case:[libs/HYPRE] 2 PE convection-reaction-diffusion binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.359 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 77)
  `run_mpi_binary ./ex4 "-n 33 -solver 10  -K 3 -B 0 -C 1 -U0 2 -F 4" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.311
Batch job 163 submitted

Job 163 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.163.out: No such file or directory
Test case:[libs/HYPRE] 2 PE FORTRAN 2-D Laplacian binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.36 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 86)
  `run_mpi_binary ./ex5f "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.13336
Batch job 164 submitted

Job 164 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.164.out: No such file or directory
Test case:[libs/HYPRE] 2 PE Semi-Structured convection binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE biharmonic binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/HYPRE] 2 PE C++ Finite Element Interface binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ example depends on non-installed header
Test case:[libs/HYPRE] 2 PE 2-D Laplacian eigenvalue binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.288 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 125)
  `run_mpi_binary ./ex11 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.5666
Batch job 165 submitted

Job 165 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.165.out: No such file or directory

Test Suite: test_module

Results

Duration1.569 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/HYPRE] Verify HYPRE module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.197 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify dynamic library available in HYPRE_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify static library is not present in HYPRE_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/HYPRE] Verify module HYPRE_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Verify header file is present in HYPRE_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/HYPRE] Sample job (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.222 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_module, line 131)
  `run_mpi_binary ./ex1 "atest" 1 1' failed
job script = /tmp/job.ohpc-test.3485
Batch job 156 submitted

Job 156 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.156.out: No such file or directory

Test Suite: test_metis

Results

Duration3.396 sec
Tests4
Failures1

Tests

test_metis

Test case:[libs/Metis] Graph partition (gnu14)
Outcome:Passed
Duration:0.349 sec
FailedNone
None
Test case:[libs/Metis] Fill-reducing ordering (gnu14)
Outcome:Passed
Duration:2.743 sec
FailedNone
None
Test case:[libs/Metis] Mesh to graph conversion (gnu14)
Outcome:Passed
Duration:0.06 sec
FailedNone
None
Test case:[libs/Metis] C API mesh partitioning (slurm/gnu14)
Outcome:Failed
Duration:0.244 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_metis, line 45)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/metis/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/metis/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./C_test: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: test_module

Results

Duration0.369 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/Metis] Verify METIS module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/Metis] Verify availability of m2gmetis binary (gnu14)
Outcome:Passed
Duration:0.047 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.019 sec
FailedNone
None
Test case:[libs/Metis] Verify dynamic library available in METIS_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Metis] Verify static library is not present in METIS_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Metis] Verify module METIS_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Metis] Verify header file is present in METIS_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration11.479 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.33 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.4406
Batch job 176 submitted

Job 176 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.176.out: No such file or directory
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:4.488 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "-no-vis -rs 2" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.3414
Batch job 177 submitted

Job 177 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.177.out: No such file or directory
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.25 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 44)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.31352
Batch job 178 submitted

Job 178 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.178.out: No such file or directory
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.411 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 54)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.1622
Batch job 179 submitted

Job 179 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.179.out: No such file or directory

Test Suite: rm_execution

Results

Duration10.426 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/MFEM] p_laplace MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.259 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.29682
Batch job 180 submitted

Job 180 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.180.out: No such file or directory
Test case:[libs/MFEM] p_laplace_perf MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.421 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./${binary} "-no-vis -rs 2" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.30478
Batch job 181 submitted

Job 181 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.181.out: No such file or directory
Test case:[libs/MFEM] p_cantilever MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.33 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 44)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.32138
Batch job 182 submitted

Job 182 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.182.out: No such file or directory
Test case:[libs/MFEM] p_diffusion MPI binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.416 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 54)
  `run_mpi_binary ./${binary} "-no-vis" $NODES $TASKS ' failed
job script = /tmp/job.ohpc-test.23229
Batch job 183 submitted

Job 183 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.183.out: No such file or directory

Test Suite: test_module

Results

Duration0.354 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.218 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.025 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.363 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/MFEM] Verify MFEM module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.224 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify (dynamic) library available in MFEM_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/MFEM] Verify module MFEM_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/MFEM] Verify header file is present in MFEM_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration13.589 sec
Tests5
Failures5

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.221 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 26)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.24851
Batch job 185 submitted

Job 185 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.185.out: No such file or directory
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.357 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.4603
Batch job 186 submitted

Job 186 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.186.out: No such file or directory
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.22 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.24802
Batch job 187 submitted

Job 187 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.187.out: No such file or directory
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:4.43 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.1647
Batch job 188 submitted

Job 188 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.188.out: No such file or directory
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.361 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 66)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.20638
Batch job 189 submitted

Job 189 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.189.out: No such file or directory

Test Suite: test_module

Results

Duration0.214 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.193 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration11.489 sec
Tests5
Failures5

Tests

rm_execution

Test case:[libs/Mumps] C (double precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.222 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 26)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.16649
Batch job 190 submitted

Job 190 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.190.out: No such file or directory
Test case:[libs/Mumps] Fortran (single precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.299 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.7550
Batch job 191 submitted

Job 191 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.191.out: No such file or directory
Test case:[libs/Mumps] Fortran (double precision) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.302 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9344
Batch job 192 submitted

Job 192 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.192.out: No such file or directory
Test case:[libs/Mumps] Fortran (complex) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.371 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.20682
Batch job 193 submitted

Job 193 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.193.out: No such file or directory
Test case:[libs/Mumps] Fortran (double complex) runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.295 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 66)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31167
Batch job 194 submitted

Job 194 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.194.out: No such file or directory

Test Suite: test_module

Results

Duration0.221 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/MUMPS] Verify MUMPS module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.199 sec
FailedNone
None
Test case:[libs/MUMPS] Verify MUMPS_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_C_module

Results

Duration0.399 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.195 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.402 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.195 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.404 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.197 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration1.201 sec
Tests7
Failures3

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/openmpi5)
Outcome:Passed
Duration:0.225 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/openmpi5)
Outcome:Passed
Duration:0.062 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/openmpi5)
Outcome:Passed
Duration:0.041 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.419 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 64)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./C_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.229 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 105)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./F90_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.225 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 147)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./CXX_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: test_C_module

Results

Duration0.384 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.185 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.388 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.394 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.189 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration0.892 sec
Tests7
Failures3

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14)
Outcome:Passed
Duration:0.067 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14)
Outcome:Passed
Duration:0.06 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14)
Outcome:Passed
Duration:0.039 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14)
Outcome:Failed
Duration:0.266 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 64)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./C_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14)
Outcome:Failed
Duration:0.229 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 105)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./F90_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14)
Outcome:Failed
Duration:0.231 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 147)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./CXX_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--

Test Suite: test_C_module

Results

Duration0.394 sec
Tests9
Failures0

Tests

test_C_module

Test case:[libs/NetCDF] Verify NETCDF module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF] Verify availability of nc-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.052 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify dynamic library available in NETCDF_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify static library is not present in NETCDF_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF] Verify module NETCDF_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF] Verify header file is present in NETCDF_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_CXX_module

Results

Duration0.401 sec
Tests9
Failures0

Tests

test_CXX_module

Test case:[libs/NetCDF-CXX] Verify NETCDF_CXX module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.199 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify availability of ncxx4-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.05 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify dynamic library available in NETCDF_CXX_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify static library is not present in NETCDF_CXX_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify module NETCDF_CXX_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-CXX] Verify header file is present in NETCDF_CXX_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_Fortran_module

Results

Duration0.404 sec
Tests9
Failures0

Tests

test_Fortran_module

Test case:[libs/NetCDF-Fortran] Verify NETCDF_FORTRAN module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.197 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify availability of nf-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify dynamic library available in NETCDF_FORTRAN_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify static library is not present in NETCDF_FORTRAN_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify module NETCDF_FORTRAN_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/NetCDF-Fortran] Verify header file is present in NETCDF_FORTRAN_INC (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: test_netcdf

Results

Duration1.121 sec
Tests7
Failures3

Tests

test_netcdf

Test case:[libs/NetCDF] ncdump availability (gnu14/mpich)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4/hdf5 available for C interface (gnu14/mpich)
Outcome:Passed
Duration:0.06 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for Fortran interface (gnu14/mpich)
Outcome:Passed
Duration:0.04 sec
FailedNone
None
Test case:[libs/NetCDF] verify nc4 available for C++ interface (gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
option no longer supported
Test case:[libs/NetCDF] C write/read (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.375 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 64)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./C_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/NetCDF] Fortran write/read (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.23 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 105)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./F90_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[libs/NetCDF] C++ write/read (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.233 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file test_netcdf, line 147)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/netcdf/tests': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./CXX_write: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
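
The three write/read failures above share one symptom: slurmstepd on c1 cannot chdir into /home/ohpc-test/tests/libs/netcdf/tests, falls back to /tmp, and then cannot find the test binary there. That points at the test tree not being visible from the compute node rather than at NetCDF itself. A minimal check, assuming ssh access to c1 and that /home is NFS-exported from the head node (paths taken from the srun output above):

  # Compare the test directory as seen from the head node and from c1
  ls -ld /home/ohpc-test/tests/libs/netcdf/tests
  ssh c1 'findmnt /home; ls -ld /home/ohpc-test/tests/libs/netcdf/tests'

  # If the directory is absent on c1, confirm the export on the head node
  exportfs -v | grep home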

Test Suite: test_pnetcdf

Results

Duration2.463 sec
Tests2
Failures2

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.231 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_pnetcdf, line 28)
  `run_mpi_binary ./C_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.19529
Batch job 203 submitted

Job 203 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.203.out: No such file or directory
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.232 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_pnetcdf, line 64)
  `run_mpi_binary -t "00:02:00" ./F90_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.31234
Batch job 204 submitted

Job 204 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.204.out: No such file or directory
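
The parallel I/O failures follow a second pattern: the batch job is submitted, Slurm reports it failed with Reason=RaisedSignal:53(Real-time_signal_19), and the expected job.<id>.out file is never written. Slurm accounting is one way to dig further; a sketch using job 203 from the output above, assuming slurmdbd accounting is enabled on this cluster:

  # Summarize the failed job from the accounting database
  sacct -j 203 --format=JobID,JobName,State,ExitCode,Elapsed,NodeList

  # Inspect the generated job script to see which directory stdout was written to
  cat /tmp/job.ohpc-test.19529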

Test Suite: test_pnetcdf

Results

Duration4.622 sec
Tests2
Failures2

Tests

test_pnetcdf

Test case:[libs/NetCDF] C parallel I/O (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.238 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_pnetcdf, line 28)
  `run_mpi_binary ./C_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.27071
Batch job 208 submitted

Job 208 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.208.out: No such file or directory
Test case:[libs/NetCDF] Fortran parallel I/O (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.384 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_pnetcdf, line 64)
  `run_mpi_binary -t "00:02:00" ./F90_parallel "atest" 2 4' failed
job script = /tmp/job.ohpc-test.30056
Batch job 209 submitted

Job 209 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.209.out: No such file or directory

Test Suite: rm_execution

Results

Duration1.379 sec
Tests11
Failures11

Tests

rm_execution

Test case:[libs/openblas/dblat1] dblat1 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.128 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 25)
  `run_serial_binary ./dblat1 $ARGS' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./dblat1: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xccblat1] xccblat1 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.116 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 34)
  `run_serial_binary ./xccblat1 $ARGS' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xccblat1: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xzcblat1] xzcblat1 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.145 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 43)
  `run_serial_binary ./xzcblat1 $ARGS' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xzcblat1: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xscblat2] xscblat2 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 52)
  `run_serial_binary ./xscblat2 "$ARGS < sin2"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xscblat2: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xdcblat2] xdcblat2 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.12 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 61)
  `run_serial_binary ./xdcblat2 "$ARGS < din2"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xdcblat2: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xccblat2] xccblat2 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.122 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 70)
  `run_serial_binary ./xccblat2 "$ARGS < cin2"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xccblat2: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xzcblat2] xzcblat2 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.126 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 79)
  `run_serial_binary ./xzcblat2 "$ARGS < zin2"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xzcblat2: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xscblat3] xscblat3 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 88)
  `run_serial_binary ./xscblat3 "$ARGS < sin3"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xscblat3: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xdcblat3] xdcblat3 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.127 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 97)
  `run_serial_binary ./xdcblat3 "$ARGS < din3"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xdcblat3: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xccblat3] xccblat3 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.121 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 106)
  `run_serial_binary ./xccblat3 "$ARGS < cin3"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xccblat3: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/openblas/xzcblat3] xzcblat3 under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.128 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 115)
  `run_serial_binary ./xzcblat3 "$ARGS < zin3"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/openblas/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./xzcblat3: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
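
All eleven OpenBLAS cases fail with the same chdir/execve symptom seen for NetCDF, so the test binaries themselves are unlikely to be at fault. If the compute nodes really have no shared view of /home, one way to confirm that a single test can run at all is to stage the binary onto the node and launch it from local storage; a rough sketch using standard Slurm tools (sbcast, srun --chdir), with /tmp chosen only for illustration:

  # Allocate c1, broadcast one test binary to its local /tmp, and run it there
  salloc -N 1 -w c1 bash -c '
    sbcast ./dblat1 /tmp/dblat1
    srun --chdir=/tmp /tmp/dblat1
  '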

Test Suite: test_lapack

Results

Duration0.268 sec
Tests1
Failures0

Tests

test_lapack

Test case:[libs/OpenBLAS/eigen] run lapack eigen-value solver (gnu14)
Outcome:Passed
Duration:0.268 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.251 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/OpenBLAS] Verify OPENBLAS module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify module OPENBLAS_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify module OPENBLAS_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify dynamic library available in OPENBLAS_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/OpenBLAS] Verify static library is not present in OPENBLAS_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.294 sec
Tests2
Failures1

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.294 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 39)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.1802
  Batch job 224 submitted

  Job 224 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.224.out: No such file or directory
  grep: job.224.out: No such file or directory
--
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants

Test Suite: sms_execution

Results

Duration0.453 sec
Tests1
Failures0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/mpich)
Outcome:Passed
Duration:0.453 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.352 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.186 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.367 sec
Tests2
Failures1

Tests

rm_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.367 sec
FailedNone
(from function `assert_success' in file ./common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 39)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.947
  Batch job 225 submitted

  Job 225 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.225.out: No such file or directory
  grep: job.225.out: No such file or directory
--
Test case:[libs/OpenCoarrays] hello_multiverse binary runs under resource manager using cafrun script (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
cafrun not supported for all MPI variants

Test Suite: sms_execution

Results

Duration1.541 sec
Tests1
Failures0

Tests

sms_execution

Test case:[libs/OpenCoarrays] hello_multiverse binary runs on head node (gnu14/openmpi5)
Outcome:Passed
Duration:1.541 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.357 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/opencoarrays] Verify opencoarrays module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify dynamic library available in OPENCOARRAYS_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify static library is not present in OPENCOARRAYS_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify header file is present in OPENCOARRAYS_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify F90 module is present in OPENCOARRAYS_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/opencoarrays] Verify module OPENCOARRAYS_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration6.882 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/PETSc] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.295 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 25)
  `run_mpi_binary ./C_mpi_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11216
Batch job 227 submitted

Job 227 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.227.out: No such file or directory
Test case:[libs/PETSc] MPI F77 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.296 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./F_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15089
Batch job 228 submitted

Job 228 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.228.out: No such file or directory
Test case:[libs/PETSc] MPI F90 binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.291 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 43)
  `run_mpi_binary ./F90_test "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.8281
Batch job 229 submitted

Job 229 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.229.out: No such file or directory

Test Suite: test_module

Results

Duration1.582 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.21 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_BIN is defined and exists
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify header file is present in PETSC_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Sample job (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.22 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_module, line 130)
  `run_mpi_binary ./C_test "atest" 1 1' failed
job script = /tmp/job.ohpc-test.24011
Batch job 226 submitted

Job 226 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.226.out: No such file or directory
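
The PETSc sample job dies the same way as the other run_mpi_binary cases: the job is accepted, killed with Real-time_signal_19, and leaves no job.226.out behind. Reproducing the submission by hand can separate a launcher or filesystem problem from a PETSc problem. A minimal batch script for the failing one-node, one-task case; the module names are assumptions inferred from the gnu14/mpich labels in this report:

  #!/bin/bash
  # Hand-written equivalent of the failing PETSc sample job (1 node, 1 task)
  #SBATCH -N 1
  #SBATCH -n 1
  #SBATCH -J petsc_sample
  #SBATCH -o petsc_sample.%j.out

  module load gnu14 mpich petsc   # assumed module names
  srun ./C_test atest

Submitting this with sbatch from the directory that contains C_test, and checking whether petsc_sample.<jobid>.out appears, would show whether the problem lies with the job's working directory rather than with the library.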

Test Suite: rm_execution

Results

Duration9.063 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/PETSc] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.373 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 25)
  `run_mpi_binary ./C_mpi_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.27686
Batch job 231 submitted

Job 231 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.231.out: No such file or directory
Test case:[libs/PETSc] MPI F77 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.307 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./F_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.32297
Batch job 232 submitted

Job 232 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.232.out: No such file or directory
Test case:[libs/PETSc] MPI F90 binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.383 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 43)
  `run_mpi_binary ./F90_test "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31112
Batch job 233 submitted

Job 233 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.233.out: No such file or directory

Test Suite: test_module

Results

Duration1.599 sec
Tests9
Failures1

Tests

test_module

Test case:[libs/PETSc] Verify PETSC module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.213 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_BIN is defined and exists
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PETSc] Verify dynamic library available in PETSC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PETSc] Verify static library is not present in PETSC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify module PETSC_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Verify header file is present in PETSC_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PETSc] Sample job (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.231 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_module, line 130)
  `run_mpi_binary ./C_test "atest" 1 1' failed
job script = /tmp/job.ohpc-test.5788
Batch job 230 submitted

Job 230 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.230.out: No such file or directory

Test Suite: test_module

Results

Duration0.322 sec
Tests7
Failures0

Tests

test_module

Test case:[HDF5] Verify HDF5 module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.193 sec
FailedNone
None
Test case:[HDF5] Verify module HDF5_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[HDF5] Verify module HDF5_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[HDF5] Verify dynamic library available in HDF5_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[HDF5] Verify static library is not present in HDF5_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[HDF5] Verify module HDF5_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[HDF5] Verify header file is present in HDF5_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration4.563 sec
Tests2
Failures2

Tests

rm_execution

Test case:[libs/PHDF5] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.219 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary -t $CMD_TIMEOUT ./C_mpi_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.4505
Batch job 234 submitted

Job 234 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.234.out: No such file or directory
Test case:[libs/PHDF5] Parallel Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.344 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 33)
  `run_mpi_binary -t $CMD_TIMEOUT ./F_mpi_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.806
Batch job 235 submitted

Job 235 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.235.out: No such file or directory

Test Suite: rm_execution

Results

Duration4.577 sec
Tests2
Failures2

Tests

rm_execution

Test case:[libs/PHDF5] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.22 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary -t $CMD_TIMEOUT ./C_mpi_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.458
Batch job 236 submitted

Job 236 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.236.out: No such file or directory
Test case:[libs/PHDF5] Parallel Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.357 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 33)
  `run_mpi_binary -t $CMD_TIMEOUT ./F_mpi_test $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.2817
Batch job 237 submitted

Job 237 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.237.out: No such file or directory
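
The PHDF5 jobs reproduce the same submit-then-killed pattern on both MPI stacks, which makes a node- or scheduler-level cause more plausible than anything HDF5-specific. Checking the state of the compute node and any drain reasons recorded by Slurm is a quick sanity test; these are standard Slurm queries, with c1 taken from the srun errors earlier in this report:

  # Show the scheduler's view of the compute node (state, reason, last boot)
  scontrol show node c1

  # List any nodes that are down or drained, with the recorded reason
  sinfo -R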

Test Suite: rm_execution

Results

Duration0.258 sec
Tests3
Failures2

Tests

rm_execution

Test case:[libs/PLASMA/C_test] C_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.125 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 29)
  `run_serial_binary ./C_test $ARGS' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/plasma/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/plasma/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./C_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/PLASMA/F_test] F_test under resource manager (slurm/gnu14)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[libs/PLASMA/F90_test] F90_test under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.133 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 48)
  `run_serial_binary ./F90_test "< gebrd_example.d"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/plasma/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/plasma/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./F90_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: test_module

Results

Duration0.295 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/PLASMA] Verify plasma module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[libs/PLASMA] Verify module PLASMA_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PLASMA] Verify module PLASMA_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PLASMA] Verify dynamic library available in PLASMA_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PLASMA] Verify static library is not present in PLASMA_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PLASMA] Verify module PLASMA_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PLASMA] Verify header file is present in PLASMA_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: test_module

Results

Duration0.272 sec
Tests5
Failures0

Tests

test_module

Test case:[PNETCDF] Verify PNETCDF module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.188 sec
FailedNone
None
Test case:[PNETCDF] Verify module PNETCDF_DIR is defined and exists (gnu14)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[PNETCDF] Verify module PNETCDF_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[PNETCDF] Verify module PNETCDF_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[PNETCDF] Verify header file is present in PNETCDF_INC (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration9.097 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/PNETCDF] Parallel Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.208 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 22)
  `run_mpi_binary ./f90tst_parallel f90tst_parallel.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.19997
Batch job 240 submitted

Job 240 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.240.out: No such file or directory
Test case:[libs/PNETCDF] Parallel Fortran binary 2 runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.339 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 31)
  `run_mpi_binary ./f90tst_parallel2 f90tst_parallel2.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.26583
Batch job 241 submitted

Job 241 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.241.out: No such file or directory
Test case:[libs/PNETCDF] Parallel Fortran binary 3 runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.338 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 40)
  `run_mpi_binary ./f90tst_parallel3 f90tst_parallel3.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.30202
Batch job 242 submitted

Job 242 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.242.out: No such file or directory
Test case:[libs/PNETCDF] Parallel Fortran binary 4 runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.212 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 49)
  `run_mpi_binary ./f90tst_parallel4 f90tst_parallel4.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18565
Batch job 243 submitted

Job 243 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.243.out: No such file or directory

Test Suite: rm_execution

Results

Duration9.129 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/PNETCDF] Parallel Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.218 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 22)
  `run_mpi_binary ./f90tst_parallel f90tst_parallel.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18630
Batch job 244 submitted

Job 244 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.244.out: No such file or directory
Test case:[libs/PNETCDF] Parallel Fortran binary 2 runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.344 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 31)
  `run_mpi_binary ./f90tst_parallel2 f90tst_parallel2.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31721
Batch job 245 submitted

Job 245 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.245.out: No such file or directory
Test case:[libs/PNETCDF] Parallel Fortran binary 3 runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.214 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 40)
  `run_mpi_binary ./f90tst_parallel3 f90tst_parallel3.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.6206
Batch job 246 submitted

Job 246 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.246.out: No such file or directory
Test case:[libs/PNETCDF] Parallel Fortran binary 4 runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.353 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 49)
  `run_mpi_binary ./f90tst_parallel4 f90tst_parallel4.nc $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15557
Batch job 247 submitted

Job 247 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.247.out: No such file or directory

Test Suite: rm_execution

Results

Duration5.777 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/PTScotch] dgraph_redist binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.215 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 22)
  `run_mpi_binary ./dgraph_redist "bump.grf" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18913
Batch job 248 submitted

Job 248 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.248.out: No such file or directory
Test case:[libs/PTScotch] strat_par binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.214 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 31)
  `run_mpi_binary ./strat_par "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9711
Batch job 249 submitted

Job 249 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.249.out: No such file or directory
Test case:[libs/PTScotch] dgord binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.348 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 40)
  `run_mpi_binary dgord "bump.grf /dev/null -vt" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11436
Batch job 250 submitted

Job 250 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.250.out: No such file or directory

Test Suite: test_module

Results

Duration0.362 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PTScotch] Verify PTSCOTCH module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.184 sec
FailedNone
None
Test case:[libs/PTScotch] Verify PTSCOTCH_DIR is defined and directory exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify dynamic library available in PTSCOTCH_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify static library is not present in PTSCOTCH_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify header file is present in PTSCOTCH_INC (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/PTScotch] Verify availability of dgscat binary (gnu14/mpich)
Outcome:Passed
Duration:0.032 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration6.869 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/PTScotch] dgraph_redist binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.22 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 22)
  `run_mpi_binary ./dgraph_redist "bump.grf" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.20744
Batch job 251 submitted

Job 251 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.251.out: No such file or directory
Test case:[libs/PTScotch] strat_par binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.29 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 31)
  `run_mpi_binary ./strat_par "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.23702
Batch job 252 submitted

Job 252 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.252.out: No such file or directory
Test case:[libs/PTScotch] dgord binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.359 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 40)
  `run_mpi_binary dgord "bump.grf /dev/null -vt" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11994
Batch job 253 submitted

Job 253 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.253.out: No such file or directory

Test Suite: test_module

Results

Duration0.373 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/PTScotch] Verify PTSCOTCH module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[libs/PTScotch] Verify PTSCOTCH_DIR is defined and directory exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify dynamic library available in PTSCOTCH_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/PTScotch] Verify static library is not present in PTSCOTCH_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify header file is present in PTSCOTCH_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/PTScotch] Verify module PTSCOTCH_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/PTScotch] Verify availability of dgscat binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.207 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/ScaLAPACK/PCSCAEX] CPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.216 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 30)
  `run_mpi_binary ./pcscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9728
Batch job 255 submitted

Job 255 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.255.out: No such file or directory
Test case:[libs/ScaLAPACK/PDSCAEX] DPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.356 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 39)
  `run_mpi_binary ./pdscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.6974
Batch job 256 submitted

Job 256 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.256.out: No such file or directory
Test case:[libs/ScaLAPACK/PSSCAEX] SPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.28 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 48)
  `run_mpi_binary ./psscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18106
Batch job 257 submitted

Job 257 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.257.out: No such file or directory
Test case:[libs/ScaLAPACK/PZSCAEX] ZPCGESV under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.355 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 57)
  `run_mpi_binary ./pzscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28117
Batch job 258 submitted

Job 258 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.258.out: No such file or directory

Test Suite: test_module

Results

Duration0.264 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/ScaLAPACK] Verify SCALAPACK module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.181 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify dynamic library available in SCALAPACK_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify static library is not present in SCALAPACK_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration9.168 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/ScaLAPACK/PCSCAEX] CPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.226 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 30)
  `run_mpi_binary ./pcscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.21711
Batch job 259 submitted

Job 259 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.259.out: No such file or directory
Test case:[libs/ScaLAPACK/PDSCAEX] DPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.291 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 39)
  `run_mpi_binary ./pdscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.10555
Batch job 260 submitted

Job 260 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.260.out: No such file or directory
Test case:[libs/ScaLAPACK/PSSCAEX] SPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.361 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 48)
  `run_mpi_binary ./psscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.27784
Batch job 261 submitted

Job 261 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.261.out: No such file or directory
Test case:[libs/ScaLAPACK/PZSCAEX] ZPCGESV under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.29 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 57)
  `run_mpi_binary ./pzscaex $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.3119
Batch job 262 submitted

Job 262 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.262.out: No such file or directory

Test Suite: test_module

Results

Duration0.276 sec
Tests5
Failures0

Tests

test_module

Test case:[libs/ScaLAPACK] Verify SCALAPACK module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.191 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify module SCALAPACK_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify dynamic library available in SCALAPACK_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/ScaLAPACK] Verify static library is not present in SCALAPACK_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.376 sec
Tests3
Failures3

Tests

rm_execution

Test case:[libs/scotch] graph_map binary runs under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.129 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 19)
  `run_serial_binary ./graph_map "m4x4.grf"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/scotch/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/scotch/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./graph_map: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/scotch] strat_seq binary runs under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 28)
  `run_serial_binary ./strat_seq ""' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/scotch/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/scotch/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./strat_seq: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/scotch] gout binary runs under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.123 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 37)
  `run_serial_binary gout "-om -gn -mn m4x4.grf - - m4x4.ps"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/scotch/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/scotch/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): gout: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
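
Two of the Scotch failures involve binaries copied into the missing test directory (./graph_map, ./strat_seq), but the third invokes gout by name, so it also depends on the Scotch module's PATH reaching the srun task on c1. A quick check of both sides, assuming the module is named scotch as the test_module suite below suggests:

  # Confirm gout resolves on the head node once the module is loaded
  module load gnu14 scotch && command -v gout

  # Confirm the exported PATH reaches a task launched on c1
  srun -w c1 -N 1 -n 1 bash -c 'command -v gout'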

Test Suite: test_module

Results

Duration0.357 sec
Tests9
Failures0

Tests

test_module

Test case:[libs/Scotch] Verify SCOTCH module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.177 sec
FailedNone
None
Test case:[libs/Scotch] Verify SCOTCH_DIR is defined and directory exists (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Scotch] Verify module SCOTCH_LIB is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Scotch] Verify dynamic library available in SCOTCH_LIB (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Scotch] Verify static library is not present in SCOTCH_LIB (gnu14)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[libs/Scotch] Verify module SCOTCH_INC is defined and exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Scotch] Verify header file is present in SCOTCH_INC (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/Scotch] Verify module SCOTCH_BIN is defined and exists
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/Scotch] Verify availability of gout binary
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration9.194 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/SLEPc] F90 SVD test binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.23 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 23)
  `run_mpi_binary ./test4f "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.23664
Batch job 266 submitted

Job 266 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.266.out: No such file or directory
Test case:[libs/SLEPc] C SVD of the Lauchli matrix binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.374 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 32)
  `run_mpi_binary ./ex15 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.1801
Batch job 267 submitted

Job 267 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.267.out: No such file or directory
Test case:[libs/SLEPc] F90 quadratic eigensystem with PEP object binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.224 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 41)
  `run_mpi_binary ./ex16f90 " -pep_nev 4 -terse " 1 1' failed
job script = /tmp/job.ohpc-test.90
Batch job 268 submitted

Job 268 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.268.out: No such file or directory
Test case:[libs/SLEPc] C nonsymmetric eigenproblem binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.366 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 50)
  `run_mpi_binary ./ex29 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.21821
Batch job 269 submitted

Job 269 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.269.out: No such file or directory

Test Suite: test_module

Results

Duration0.337 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/slepc] Verify slepc module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.206 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify dynamic library available in SLEPC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify static library is not present in SLEPC_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify header file is present in SLEPC_INC (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration10.315 sec
Tests4
Failures4

Tests

rm_execution

Test case:[libs/SLEPc] F90 SVD test binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.234 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 23)
  `run_mpi_binary ./test4f "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.2687
Batch job 270 submitted

Job 270 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.270.out: No such file or directory
Test case:[libs/SLEPc] C SVD of the Lauchli matrix binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.236 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 32)
  `run_mpi_binary ./ex15 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.27112
Batch job 271 submitted

Job 271 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.271.out: No such file or directory
Test case:[libs/SLEPc] F90 quadratic eigensystem with PEP object binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.383 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 41)
  `run_mpi_binary ./ex16f90 " -pep_nev 4 -terse " 1 1' failed
job script = /tmp/job.ohpc-test.18966
Batch job 272 submitted

Job 272 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.272.out: No such file or directory
Test case:[libs/SLEPc] C nonsymmetric eigenproblem binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.462 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 50)
  `run_mpi_binary ./ex29 "" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.29592
Batch job 273 submitted

Job 273 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.273.out: No such file or directory

Test Suite: test_module

Results

Duration0.344 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/slepc] Verify slepc module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.211 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.024 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify dynamic library available in SLEPC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify static library is not present in SLEPC_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/slepc] Verify module SLEPC_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/slepc] Verify header file is present in SLEPC_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration0.25 sec
Tests2
Failures2

Tests

rm_execution

Test case:[libs/SuperLU] C test runs under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.13 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 26)
  `run_serial_binary $EXE' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/superlu/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/superlu/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./superlu: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[libs/SuperLU] F77 test runs under resource manager (slurm/gnu14)
Outcome:Failed
Duration:0.12 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 36)
  `run_serial_binary "$EXE < g20.rua"' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/superlu/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/libs/superlu/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./superlu: No such file or directory
srun: error: c1: task 0: Exited with exit code 2

Test Suite: test_module

Results

Duration0.203 sec
Tests2
Failures0

Tests

test_module

Test case:[libs/SUPERLU] Verify SUPERLU module is loaded and matches rpm version (gnu14)
Outcome:Passed
Duration:0.182 sec
FailedNone
None
Test case:[libs/SUPERLU] Verify SUPERLU_DIR is defined and directory exists (gnu14)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration19.694 sec
Tests9
Failures9

Tests

rm_execution

Test case:[libs/superLU_dist] PDGSSVX with full (default) options (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.236 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./pddrive "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.20182
Batch job 285 submitted

Job 285 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.285.out: No such file or directory
Test case:[libs/superLU_dist] pdgssvx_ABglobal with full (default) options (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.231 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 43)
  `run_mpi_binary ./pddrive_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31066
Batch job 286 submitted

Job 286 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.286.out: No such file or directory
Test case:[libs/superLU_dist] vary RHS (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.46 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 52)
  `run_mpi_binary ./pddrive1 "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.6883
Batch job 287 submitted

Job 287 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.287.out: No such file or directory
Test case:[libs/superLU_dist] vary RHS ABglobal (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.234 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 61)
  `run_mpi_binary ./pddrive1_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.13912
Batch job 288 submitted

Job 288 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.288.out: No such file or directory
Test case:[libs/superLU_dist] reuse permutation vector (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.383 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 70)
  `run_mpi_binary ./pddrive2 "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.6793
Batch job 289 submitted

Job 289 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.289.out: No such file or directory
Test case:[libs/superLU_dist] reuse permutation vector ABglobal (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.38 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 79)
  `run_mpi_binary ./pddrive2_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.4655
Batch job 290 submitted

Job 290 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.290.out: No such file or directory
Test case:[libs/superLU_dist] reuse symbolic factorization (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.308 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 88)
  `run_mpi_binary ./pddrive3 "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.11789
Batch job 291 submitted

Job 291 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.291.out: No such file or directory
Test case:[libs/superLU_dist] reuse symbolic factorization ABglobal (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.228 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 97)
  `run_mpi_binary ./pddrive3_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.13685
Batch job 292 submitted

Job 292 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.292.out: No such file or directory
Test case:[libs/superLU_dist] multi-grid (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.234 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 110)
  `run_mpi_binary ./pddrive4 "g20.rua" $NODES 10' failed
job script = /tmp/job.ohpc-test.16663
Batch job 293 submitted

Job 293 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.293.out: No such file or directory

Test Suite: test_module

Results

Duration0.333 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/superlu_dist] Verify SUPERLU_DIST module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.205 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_LIB is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify dynamic library available in SUPERLU_DIST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify static library is not present in SUPERLU_DIST_LIB (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_INC is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify header file is present in SUPERLU_DIST_INC (gnu14/openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration23.856 sec
Tests9
Failures9

Tests

rm_execution

Test case:[libs/superLU_dist] PDGSSVX with full (default) options (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.226 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 34)
  `run_mpi_binary ./pddrive "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17152
Batch job 276 submitted

Job 276 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.276.out: No such file or directory
Test case:[libs/superLU_dist] pdgssvx_ABglobal with full (default) options (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.288 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 43)
  `run_mpi_binary ./pddrive_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18028
Batch job 277 submitted

Job 277 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.277.out: No such file or directory
Test case:[libs/superLU_dist] vary RHS (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.295 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 52)
  `run_mpi_binary ./pddrive1 "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.16750
Batch job 278 submitted

Job 278 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.278.out: No such file or directory
Test case:[libs/superLU_dist] vary RHS ABglobal (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.363 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 61)
  `run_mpi_binary ./pddrive1_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28620
Batch job 279 submitted

Job 279 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.279.out: No such file or directory
Test case:[libs/superLU_dist] reuse permutation vector (slurm/gnu14/mpich)
Outcome:Failed
Duration:4.429 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 70)
  `run_mpi_binary ./pddrive2 "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.18964
Batch job 280 submitted

Job 280 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.280.out: No such file or directory
Test case:[libs/superLU_dist] reuse permutation vector ABglobal (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.223 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 79)
  `run_mpi_binary ./pddrive2_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.1023
Batch job 281 submitted

Job 281 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.281.out: No such file or directory
Test case:[libs/superLU_dist] reuse symbolic factorization (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.371 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 88)
  `run_mpi_binary ./pddrive3 "-r 2 -c 2 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28833
Batch job 282 submitted

Job 282 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.282.out: No such file or directory
Test case:[libs/superLU_dist] reuse symbolic factorization ABglobal (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.363 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 97)
  `run_mpi_binary ./pddrive3_ABglobal "-r 1 -c 1 g20.rua" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.7660
Batch job 283 submitted

Job 283 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.283.out: No such file or directory
Test case:[libs/superLU_dist] multi-grid (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.298 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 110)
  `run_mpi_binary ./pddrive4 "g20.rua" $NODES 10' failed
job script = /tmp/job.ohpc-test.4144
Batch job 284 submitted

Job 284 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.284.out: No such file or directory

Test Suite: test_module

Results

Duration0.334 sec
Tests7
Failures0

Tests

test_module

Test case:[libs/superlu_dist] Verify SUPERLU_DIST module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.207 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_LIB is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify dynamic library available in SUPERLU_DIST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify static library is not present in SUPERLU_DIST_LIB (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify module SUPERLU_DIST_INC is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[libs/superlu_dist] Verify header file is present in SUPERLU_DIST_INC (gnu14/mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None

Test Suite: build

Results

Duration3.209 sec
Tests2
Failures0

Tests

build

Test case:[Trilinos] verify availability of Makefile.export.Trilinos (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
deprecated with newer Trilinos
Test case:[Trilinos] build Trilinos executables (gnu14/openmpi5)
Outcome:Passed
Duration:3.209 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration17.2 sec
Tests9
Failures7

Tests

rm_execution

Test case:[libs/Trilinos] Kokkos-MemorySpace runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.235 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 26)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.29028
Batch job 301 submitted

Job 301 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.301.out: No such file or directory
Test case:[libs/Trilinos] Tpetra-InitMPI runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.225 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9561
Batch job 302 submitted

Job 302 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.302.out: No such file or directory
Test case:[libs/Trilinos] Tpetra-DataRedistribution runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.377 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.423
Batch job 303 submitted

Job 303 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.303.out: No such file or directory
Test case:[libs/Trilinos] Epetra-DataRedistribution runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.452 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.23266
Batch job 304 submitted

Job 304 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.304.out: No such file or directory
Test case:[libs/Trilinos] Epetra-Galeri runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Galeri in 2.x
Test case:[libs/Trilinos] Epetra-Ifpack runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Ifpack in 2.x
Test case:[libs/Trilinos] Teuchos-ParameterList runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.232 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 88)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.21802
Batch job 305 submitted

Job 305 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.305.out: No such file or directory
Test case:[libs/Trilinos] Teuchos-LAPACK runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.369 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 98)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.16902
Batch job 306 submitted

Job 306 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.306.out: No such file or directory
Test case:[libs/Trilinos] Teuchos-BLAS runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.31 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 108)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28594
Batch job 307 submitted

Job 307 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.307.out: No such file or directory

Test Suite: build

Results

Duration4.626 sec
Tests2
Failures0

Tests

build

Test case:[Trilinos] verify availability of Makefile.export.Trilinos (gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
deprecated with newer Trilinos
Test case:[Trilinos] build Trilinos executables (gnu14/mpich)
Outcome:Passed
Duration:4.626 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration16.041 sec
Tests9
Failures7

Tests

rm_execution

Test case:[libs/Trilinos] Kokkos-MemorySpace runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.22 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 26)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.21495
Batch job 294 submitted

Job 294 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.294.out: No such file or directory
Test case:[libs/Trilinos] Tpetra-InitMPI runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.286 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 36)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.32271
Batch job 295 submitted

Job 295 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.295.out: No such file or directory
Test case:[libs/Trilinos] Tpetra-DataRedistribution runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.357 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 46)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.29857
Batch job 296 submitted

Job 296 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.296.out: No such file or directory
Test case:[libs/Trilinos] Epetra-DataRedistribution runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.361 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 56)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.24220
Batch job 297 submitted

Job 297 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.297.out: No such file or directory
Test case:[libs/Trilinos] Epetra-Galeri runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Galeri in 2.x
Test case:[libs/Trilinos] Epetra-Ifpack runs under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip Epetra-Ifpack in 2.x
Test case:[libs/Trilinos] Teuchos-ParameterList runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.22 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 88)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.3776
Batch job 298 submitted

Job 298 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.298.out: No such file or directory
Test case:[libs/Trilinos] Teuchos-LAPACK runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.37 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 98)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.5844
Batch job 299 submitted

Job 299 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.299.out: No such file or directory
Test case:[libs/Trilinos] Teuchos-BLAS runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.227 sec
FailedNone
(from function `run_mpi_binary' in file ../../../common/functions, line 403,
 in test file rm_execution, line 108)
  `run_mpi_binary $EXE $ARGS $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17969
Batch job 300 submitted

Job 300 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.300.out: No such file or directory

Test Suite: lmod_installed

Results

Duration0.047 sec
Tests1
Failures0

Tests

lmod_installed

Test case:[modules] Check if lmod RPM installed
Outcome:Passed
Duration:0.047 sec
FailedNone
None

Test Suite: interactive_commands

Results

Duration5.314 sec
Tests8
Failures0

Tests

interactive_commands

Test case:[modules] module purge
Outcome:Passed
Duration:0.153 sec
FailedNone
None
Test case:[modules] module list
Outcome:Passed
Duration:0.253 sec
FailedNone
None
Test case:[modules] module help
Outcome:Passed
Duration:0.281 sec
FailedNone
None
Test case:[modules] module load/unload
Outcome:Passed
Duration:2.466 sec
FailedNone
None
Test case:[modules] module whatis
Outcome:Passed
Duration:0.275 sec
FailedNone
None
Test case:[modules] module swap
Outcome:Passed
Duration:0.548 sec
FailedNone
None
Test case:[modules] path updated
Outcome:Passed
Duration:0.78 sec
FailedNone
None
Test case:[modules] module depends-on
Outcome:Passed
Duration:0.558 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration1.808 sec
Tests4
Failures3

Tests

rm_execution

Test case:[modules] env variable passes through ()
Outcome:Failed
Duration:0.494 sec
FailedNone
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 27)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/modules': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/modules': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./test_env: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[modules] loaded module passes through ()
Outcome:Failed
Duration:0.475 sec
FailedNone
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 36)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/modules': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/modules': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./test_mod_passthrough: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[modules] module commands available in RMS job ()
Outcome:Failed
Duration:0.414 sec
FailedNone
(from function `assert_success' in file ../common/test_helper_functions.bash, line 58,
 in test file rm_execution, line 44)
  `assert_success' failed

-- command failed --
status : 2
output (4 lines):
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/modules': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/modules': No such file or directory: going to /tmp instead
  slurmstepd-c1: error: execve(): /tmp/./test_mod_cmd: No such file or directory
  srun: error: c1: task 0: Exited with exit code 2
--
Test case:[modules] module load propagates through RMS ()
Outcome:Passed
Duration:0.425 sec
FailedNone
None

Test Suite: build

Results

Duration2.023 sec
Tests3
Failures0

Tests

build

Test case:[MPI] build/execute C binary (gnu14/mpich)
Outcome:Passed
Duration:0.621 sec
FailedNone
None
Test case:[MPI] build/execute C++ binary (gnu14/mpich)
Outcome:Passed
Duration:0.756 sec
FailedNone
None
Test case:[MPI] build/execute F90 binary (gnu14/mpich)
Outcome:Passed
Duration:0.646 sec
FailedNone
None

Test Suite: man_page_check

Results

Duration0.106 sec
Tests1
Failures0

Tests

man_page_check

Test case:[MPI] mpicc man page available (gnu14/mpich)
Outcome:Passed
Duration:0.106 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration8.055 sec
Tests3
Failures3

Tests

rm_execution_multi_host

Test case:[MPI] C binary runs on two nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.324 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 24)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.4866
  Batch job 311 submitted

  Job 311 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.311.out: No such file or directory
  grep: job.311.out: No such file or directory
--
Test case:[MPI] C++ binary runs on two nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.398 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 36)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.15187
  Batch job 312 submitted

  Job 312 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.312.out: No such file or directory
  grep: job.312.out: No such file or directory
--
Test case:[MPI] F90 binary runs on two nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.333 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 45)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.230
  Batch job 313 submitted

  Job 313 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.313.out: No such file or directory
  grep: job.313.out: No such file or directory
--

Test Suite: rm_execution_single_host

Results

Duration5.964 sec
Tests3
Failures3

Tests

rm_execution_single_host

Test case:[MPI] C binary runs on single node under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.279 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_single_host, line 24)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.23665
  Batch job 308 submitted

  Job 308 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.308.out: No such file or directory
  grep: job.308.out: No such file or directory
--
Test case:[MPI] C++ binary runs on single node under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.347 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_single_host, line 36)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.7521
  Batch job 309 submitted

  Job 309 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.309.out: No such file or directory
  grep: job.309.out: No such file or directory
--
Test case:[MPI] F90 binary runs on single node under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.338 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_single_host, line 45)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.27904
  Batch job 310 submitted

  Job 310 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.310.out: No such file or directory
  grep: job.310.out: No such file or directory
--

Test Suite: version_match

Results

Duration0.565 sec
Tests3
Failures0

Tests

version_match

Test case:[MPI] MPI module loaded (gnu14/mpich)
Outcome:Passed
Duration:0.132 sec
FailedNone
None
Test case:[MPI] MPI module version available (gnu14/mpich)
Outcome:Passed
Duration:0.145 sec
FailedNone
None
Test case:[MPI] mpicc, mpicxx, and mpif90 versions match module (gnu14/mpich)
Outcome:Passed
Duration:0.288 sec
FailedNone
None

Test Suite: build

Results

Duration3.495 sec
Tests3
Failures0

Tests

build

Test case:[MPI] build/execute C binary (gnu14/openmpi5)
Outcome:Passed
Duration:1.69 sec
FailedNone
None
Test case:[MPI] build/execute C++ binary (gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ not available with openmpi5
Test case:[MPI] build/execute F90 binary (gnu14/openmpi5)
Outcome:Passed
Duration:1.805 sec
FailedNone
None

Test Suite: man_page_check

Results

Duration0.098 sec
Tests1
Failures0

Tests

man_page_check

Test case:[MPI] mpicc man page available (gnu14/openmpi5)
Outcome:Passed
Duration:0.098 sec
FailedNone
None

Test Suite: rm_execution_multi_host

Results

Duration4.699 sec
Tests3
Failures2

Tests

rm_execution_multi_host

Test case:[MPI] C binary runs on two nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.404 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 24)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.5994
  Batch job 316 submitted

  Job 316 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.316.out: No such file or directory
  grep: job.316.out: No such file or directory
--
Test case:[MPI] C++ binary runs on two nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ not available with openmpi5
Test case:[MPI] F90 binary runs on two nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.295 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_multi_host, line 45)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.2366
  Batch job 317 submitted

  Job 317 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.317.out: No such file or directory
  grep: job.317.out: No such file or directory
--

Test Suite: rm_execution_single_host

Results

Duration3.62 sec
Tests3
Failures2

Tests

rm_execution_single_host

Test case:[MPI] C binary runs on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.281 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_single_host, line 24)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.27531
  Batch job 314 submitted

  Job 314 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.314.out: No such file or directory
  grep: job.314.out: No such file or directory
--
Test case:[MPI] C++ binary runs on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
C++ not available with openmpi5
Test case:[MPI] F90 binary runs on single node under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.339 sec
FailedNone
(from function `assert_success' in file ../../common/test_helper_functions.bash, line 58,
 in test file rm_execution_single_host, line 45)
  `assert_success' failed

-- command failed --
status : 1
output (8 lines):
  job script = /tmp/job.ohpc-test.4743
  Batch job 315 submitted

  Job 315 failed...
  Reason=RaisedSignal:53(Real-time_signal_19)

  cat: job.315.out: No such file or directory
  grep: job.315.out: No such file or directory
--

Test Suite: version_match

Results

Duration0.598 sec
Tests3
Failures0

Tests

version_match

Test case:[MPI] MPI module loaded (gnu14/openmpi5)
Outcome:Passed
Duration:0.138 sec
FailedNone
None
Test case:[MPI] MPI module version available (gnu14/openmpi5)
Outcome:Passed
Duration:0.138 sec
FailedNone
None
Test case:[MPI] mpicc, mpicxx, and mpif90 versions match module (gnu14/openmpi5)
Outcome:Passed
Duration:0.322 sec
FailedNone
None

Test Suite: conman

Results

Duration0.162 sec
Tests3
Failures0

Tests

conman

Test case:[ConMan] Verify conman binary available
Outcome:Passed
Duration:0.042 sec
FailedNone
None
Test case:[ConMan] Verify rpm version matches binary
Outcome:Passed
Duration:0.062 sec
FailedNone
None
Test case:[ConMan] Verify man page availability
Outcome:Passed
Duration:0.058 sec
FailedNone
None

Test Suite: warewulf-ipmi

Results

Duration0.0 sec
Tests1
Failures0

Tests

warewulf-ipmi

Test case:[warewulf-ipmi] ipmitool lanplus protocol
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Warewulf not installed, skipping warewulf-ipmitool check

Test Suite: test_module

Results

Duration1.984 sec
Tests6
Failures0

Tests

test_module

Test case:[Dimemas] Verify dimemas module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.198 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.03 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[Dimemas] Verify availability of prv2dim binary (gnu14/mpich)
Outcome:Passed
Duration:0.047 sec
FailedNone
None
Test case:[Dimemas] Verify availability of Dimemas binary (gnu14/mpich)
Outcome:Passed
Duration:0.045 sec
FailedNone
None
Test case:[Dimemas] Run Dimemas simulation (gnu14/mpich)
Outcome:Passed
Duration:1.63 sec
FailedNone
None

Test Suite: test_module

Results

Duration2.014 sec
Tests6
Failures0

Tests

test_module

Test case:[Dimemas] Verify dimemas module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.212 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[Dimemas] Verify module DIMEMAS_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[Dimemas] Verify availability of prv2dim binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.048 sec
FailedNone
None
Test case:[Dimemas] Verify availability of Dimemas binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.047 sec
FailedNone
None
Test case:[Dimemas] Run Dimemas simulation (gnu14/openmpi5)
Outcome:Passed
Duration:1.639 sec
FailedNone
None

Test Suite: test_imb_mpi1

Results

Duration1.212 sec
Tests1
Failures1

Tests

test_imb_mpi1

Test case:[Libs/IMB] run IMB-MPI1 on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.212 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi1, line 38)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.17008
Batch job 318 submitted

Job 318 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.318.out: No such file or directory

Test Suite: test_imb_mpi2

Results

Duration4.55 sec
Tests2
Failures2

Tests

test_imb_mpi2

Test case:[Libs/IMB] run IMB-EXT on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.207 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi2, line 41)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.19925
Batch job 319 submitted

Job 319 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.319.out: No such file or directory
Test case:[Libs/IMB] run IMB-IO on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.343 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi2, line 53)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.29172
Batch job 320 submitted

Job 320 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.320.out: No such file or directory

Test Suite: test_imb_mpi3

Results

Duration1.213 sec
Tests2
Failures1

Tests

test_imb_mpi3

Test case:[Libs/IMB] run IMB-NBC on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[Libs/IMB] run IMB-RMA on 2 nodes under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.213 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi3, line 56)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.3200
Batch job 321 submitted

Job 321 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.321.out: No such file or directory

Test Suite: test_module

Results

Duration0.238 sec
Tests3
Failures0

Tests

test_module

Test case:[IMB] Verify IMB module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.189 sec
FailedNone
None
Test case:[IMB] Verify IMB_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[IMB] Verify executables are present in IMB_DIR/bin (gnu14-mpich)
Outcome:Passed
Duration:0.028 sec
FailedNone
None

Test Suite: test_imb_mpi1

Results

Duration1.221 sec
Tests1
Failures1

Tests

test_imb_mpi1

Test case:[Libs/IMB] run IMB-MPI1 on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.221 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi1, line 38)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.9364
Batch job 322 submitted

Job 322 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.322.out: No such file or directory

Test Suite: test_imb_mpi2

Results

Duration5.637 sec
Tests2
Failures2

Tests

test_imb_mpi2

Test case:[Libs/IMB] run IMB-EXT on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.214 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi2, line 41)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.15263
Batch job 323 submitted

Job 323 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.323.out: No such file or directory
Test case:[Libs/IMB] run IMB-IO on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:4.423 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi2, line 53)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28543
Batch job 324 submitted

Job 324 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.324.out: No such file or directory

Test Suite: test_imb_mpi3

Results

Duration1.215 sec
Tests2
Failures1

Tests

test_imb_mpi3

Test case:[Libs/IMB] run IMB-NBC on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skipped
Test case:[Libs/IMB] run IMB-RMA on 2 nodes under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.215 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_imb_mpi3, line 56)
  `run_mpi_binary -t $CMD_TIMEOUT $EXE "$ARGS" $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.31284
Batch job 325 submitted

Job 325 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.325.out: No such file or directory

Test Suite: test_module

Results

Duration0.238 sec
Tests3
Failures0

Tests

test_module

Test case:[IMB] Verify IMB module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.191 sec
FailedNone
None
Test case:[IMB] Verify IMB_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[IMB] Verify executables are present in IMB_DIR/bin (gnu14-openmpi5)
Outcome:Passed
Duration:0.026 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.432 sec
Tests2
Failures2

Tests

rm_execution

Test case:[OMB] run osu_bw on 2 nodes under resource manager (/gnu14/mpich)
Outcome:Failed
Duration:1.219 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 22)
  `run_mpi_binary -t "$CMD_TIMEOUT" "$EXE" "-m 512" 2 2' failed
job script = /tmp/job.ohpc-test.1668
Batch job 326 submitted

Job 326 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.326.out: No such file or directory
Test case:[OMB] run osu_latency on 2 nodes under resource manager (/gnu14/mpich)
Outcome:Failed
Duration:1.213 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 35)
  `run_mpi_binary -t "$CMD_TIMEOUT" "$EXE" "-m $MESSAGE_SIZE" 2 2' failed
job script = /tmp/job.ohpc-test.11371
Batch job 327 submitted

Job 327 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.327.out: No such file or directory

Test Suite: test_module

Results

Duration0.377 sec
Tests7
Failures0

Tests

test_module

Test case:[OMB] Verify OMB module is loaded and matches rpm version (gnu14-mpich)
Outcome:Passed
Duration:0.19 sec
FailedNone
None
Test case:[OMB] Verify OMB_DIR is defined and directory exists (gnu14-mpich)
Outcome:Passed
Duration:0.02 sec
FailedNone
None
Test case:[OMB] Verify osu_bw binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_latency binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_allgather binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_get_bw binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_put_latency binary is available (gnu14-mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration2.45 sec
Tests2
Failures2

Tests

rm_execution

Test case:[OMB] run osu_bw on 2 nodes under resource manager (/gnu14/openmpi5)
Outcome:Failed
Duration:1.225 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 22)
  `run_mpi_binary -t "$CMD_TIMEOUT" "$EXE" "-m 512" 2 2' failed
job script = /tmp/job.ohpc-test.17546
Batch job 328 submitted

Job 328 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.328.out: No such file or directory
Test case:[OMB] run osu_latency on 2 nodes under resource manager (/gnu14/openmpi5)
Outcome:Failed
Duration:1.225 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 35)
  `run_mpi_binary -t "$CMD_TIMEOUT" "$EXE" "-m $MESSAGE_SIZE" 2 2' failed
job script = /tmp/job.ohpc-test.932
Batch job 329 submitted

Job 329 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.329.out: No such file or directory

Test Suite: test_module

Results

Duration0.383 sec
Tests7
Failures0

Tests

test_module

Test case:[OMB] Verify OMB module is loaded and matches rpm version (gnu14-openmpi5)
Outcome:Passed
Duration:0.194 sec
FailedNone
None
Test case:[OMB] Verify OMB_DIR is defined and directory exists (gnu14-openmpi5)
Outcome:Passed
Duration:0.021 sec
FailedNone
None
Test case:[OMB] Verify osu_bw binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_latency binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_allgather binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[OMB] Verify osu_get_bw binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[OMB] Verify osu_put_latency binary is available (gnu14-openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration4.077 sec
Tests3
Failures3

Tests

rm_execution

Test case:[perf-tools/Scalasca] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.234 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary -s 1 ./mpi/C_mpi_test $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.20619
Batch job 348 submitted

Job 348 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.348.out: No such file or directory
Test case:[perf-tools/Scalasca] MPI C++ binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.313 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 38)
  `run_mpi_binary -s 1 ./mpi/CXX_mpi_test $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.20743
Batch job 349 submitted

Job 349 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.349.out: No such file or directory
Test case:[perf-tools/Scalasca] Serial C OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.53 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 52)
  `run_serial_binary -s 1 ./serial/C_omp_test' failed with status 2
S=C=A=N: Scalasca 2.6.1 trace collection and analysis
S=C=A=N: ./scorep_C_omp_test_1p1x2_trace experiment archive
S=C=A=N: Fri Mar 21 15:17:56 2025: Collect start
/bin/srun -n 1 -N 1 -t 1 ./serial/C_omp_test
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scalasca/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scalasca/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/C_omp_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
S=C=A=N: Fri Mar 21 15:17:56 2025: Collect done (status=0) 0s
Abort: missing experiment archive ./scorep_C_omp_test_1p1x2_trace

Test Suite: test_module

Results

Duration0.52 sec
Tests11
Failures0

Tests

test_module

Test case:[perf-tools/Scalasca] Verify scalasca module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.198 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scalasca binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scan binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.mpi binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.omp binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.hyb binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.ser binary (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of square binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of user guide for scalasca (gnu14/mpich)
Outcome:Passed
Duration:0.036 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration3.016 sec
Tests3
Failures3

Tests

rm_execution

Test case:[perf-tools/Scalasca] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.236 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 24)
  `run_mpi_binary -s 1 ./mpi/C_mpi_test $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.14476
Batch job 351 submitted

Job 351 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.351.out: No such file or directory
Test case:[perf-tools/Scalasca] MPI C++ binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.238 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 38)
  `run_mpi_binary -s 1 ./mpi/CXX_mpi_test $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.12074
Batch job 352 submitted

Job 352 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.352.out: No such file or directory
Test case:[perf-tools/Scalasca] Serial C OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.542 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 52)
  `run_serial_binary -s 1 ./serial/C_omp_test' failed with status 2
S=C=A=N: Scalasca 2.6.1 trace collection and analysis
S=C=A=N: ./scorep_C_omp_test_1p1x2_trace experiment archive
S=C=A=N: Fri Mar 21 15:18:13 2025: Collect start
/bin/srun -n 1 -N 1 -t 1 ./serial/C_omp_test
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scalasca/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scalasca/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/C_omp_test: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
S=C=A=N: Fri Mar 21 15:18:13 2025: Collect done (status=0) 0s
Abort: missing experiment archive ./scorep_C_omp_test_1p1x2_trace

Test Suite: test_module

Results

Duration0.529 sec
Tests11
Failures0

Tests

test_module

Test case:[perf-tools/Scalasca] Verify scalasca module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify module SCALASCA_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scalasca binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scan binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.mpi binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.omp binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.hyb binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of scout.ser binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of square binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Scalasca] Verify availability of user guide for scalasca (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None

Test Suite: instrumenter_test

Results

Duration2.102 sec
Tests9
Failures0

Tests

instrumenter_test

Test case:[perf-tools/Score-P] MPI C binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.169 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.17 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.167 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.187 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.188 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.345 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.346 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary includes expected Score-P instrumenter symbols (/gnu14/mpich)
Outcome:Passed
Duration:0.343 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration15.348 sec
Tests9
Failures9

Tests

rm_execution

Test case:[perf-tools/Score-P] MPI C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.237 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 45)
  `run_mpi_binary ./mpi/main_mpi_c $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.32258
Batch job 330 submitted

Job 330 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.330.out: No such file or directory
Test case:[perf-tools/Score-P] MPI C++ binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.311 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 62)
  `run_mpi_binary ./mpi/main_mpi_cxx $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.12466
Batch job 331 submitted

Job 331 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.331.out: No such file or directory
Test case:[perf-tools/Score-P] MPI Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.313 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 79)
  `run_mpi_binary ./mpi/main_mpi_fort $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.9895
Batch job 332 submitted

Job 332 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.332.out: No such file or directory
Test case:[perf-tools/Score-P] Serial C OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.145 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 96)
  `run_serial_binary ./serial/main_omp_c' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/main_omp_c: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.128 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 113)
  `run_serial_binary ./serial/main_omp_cxx' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/main_omp_cxx: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:0.134 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 130)
  `run_serial_binary ./serial/main_omp_fort' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/main_omp_fort: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.382 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 147)
  `run_mpi_binary ./mpi/main_hyb_c $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.32547
Batch job 336 submitted

Job 336 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.336.out: No such file or directory
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:2.313 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 164)
  `run_mpi_binary ./mpi/main_hyb_cxx $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.27362
Batch job 337 submitted

Job 337 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.337.out: No such file or directory
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary runs under resource manager (slurm/gnu14/mpich)
Outcome:Failed
Duration:3.385 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 181)
  `run_mpi_binary ./mpi/main_hyb_fort $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.31024
Batch job 338 submitted

Job 338 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.338.out: No such file or directory
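
The serial OpenMP failures in this suite all show slurmstepd falling back from /home/ohpc-test/tests/perf-tools/scorep/tests to /tmp and then failing to exec the binary, which suggests the test tree (or /home itself) is not visible on compute node c1. A minimal sketch of a check one could run from the SMS, reusing the clush and srun commands exercised earlier in this report and the path and node name taken from the error output above:

  # Path and node name copied from the failure output above
  TESTDIR=/home/ohpc-test/tests/perf-tools/scorep/tests
  # Is /home mounted on the compute node, and does the test tree exist there?
  clush -w c1 "findmnt /home; ls -ld $TESTDIR"
  # Same check through the resource manager, avoiding the failing chdir
  srun -w c1 --chdir=/tmp /bin/ls -ld "$TESTDIR"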

Test Suite: test_module

Results

Duration0.747 sec
Tests14
Failures0

Tests

test_module

Test case:[perf-tools/Score-P] Verify scorep module is loaded and matches rpm version (gnu14/mpich)
Outcome:Passed
Duration:0.196 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_DIR is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_BIN is defined and exists (gnu14/mpich)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-backend-info binary (gnu14/mpich)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-config binary (gnu14/mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-info binary (gnu14/mpich)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-preload-init binary (gnu14/mpich)
Outcome:Passed
Duration:0.033 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-score binary (gnu14/mpich)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-wrapper binary (gnu14/mpich)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep wrapper binaries (gnu14/mpich)
Outcome:Passed
Duration:0.158 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of user guide for scorep (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of OPEN_ISSUES for scorep (gnu14/mpich)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of COPYING for scorep (gnu14/mpich)
Outcome:Passed
Duration:0.034 sec
FailedNone
None

Test Suite: instrumenter_test

Results

Duration2.221 sec
Tests9
Failures0

Tests

instrumenter_test

Test case:[perf-tools/Score-P] MPI C binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.18 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI C++ binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.183 sec
FailedNone
None
Test case:[perf-tools/Score-P] MPI Fortran binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.178 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.192 sec
FailedNone
None
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.194 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.371 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.366 sec
FailedNone
None
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary includes expected Score-P instrumenter symbols (/gnu14/openmpi5)
Outcome:Passed
Duration:0.365 sec
FailedNone
None

Test Suite: rm_execution

Results

Duration14.35 sec
Tests9
Failures9

Tests

rm_execution

Test case:[perf-tools/Score-P] MPI C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.244 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 45)
  `run_mpi_binary ./mpi/main_mpi_c $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.29020
Batch job 339 submitted

Job 339 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.339.out: No such file or directory
Test case:[perf-tools/Score-P] MPI C++ binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.318 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 62)
  `run_mpi_binary ./mpi/main_mpi_cxx $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.4121
Batch job 340 submitted

Job 340 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.340.out: No such file or directory
Test case:[perf-tools/Score-P] MPI Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.324 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 79)
  `run_mpi_binary ./mpi/main_mpi_fort $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.24303
Batch job 341 submitted

Job 341 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.341.out: No such file or directory
Test case:[perf-tools/Score-P] Serial C OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.153 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 96)
  `run_serial_binary ./serial/main_omp_c' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/main_omp_c: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[perf-tools/Score-P] Serial C++ OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.132 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 113)
  `run_serial_binary ./serial/main_omp_cxx' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/main_omp_cxx: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[perf-tools/Score-P] Serial Fortran OpenMP binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:0.133 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file rm_execution, line 130)
  `run_serial_binary ./serial/main_omp_fort' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/perf-tools/scorep/tests': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./serial/main_omp_fort: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:3.4 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 147)
  `run_mpi_binary ./mpi/main_hyb_c $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.18057
Batch job 345 submitted

Job 345 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.345.out: No such file or directory
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) C++ binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.324 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 164)
  `run_mpi_binary ./mpi/main_hyb_cxx $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.32585
Batch job 346 submitted

Job 346 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.346.out: No such file or directory
Test case:[perf-tools/Score-P] Hybrid (MPI+OpenMP) Fortran binary runs under resource manager (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:2.322 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file rm_execution, line 181)
  `run_mpi_binary ./mpi/main_hyb_fort $ARGS $NODES "$TASKS"' failed
job script = /tmp/job.ohpc-test.5669
Batch job 347 submitted

Job 347 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.347.out: No such file or directory
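
The MPI cases in both rm_execution suites fail the same way: the batch job is submitted, reported failed with Reason=RaisedSignal:53(Real-time_signal_19), and no job.<id>.out file is left behind. A sketch of how one might pull more detail on one of these jobs, using job 339 from the log above and assuming accounting records are retained (the sacct check is skipped later in this report, so they may not be):

  # Controller-side view, if the job has not yet been purged (MinJobAge)
  scontrol show job 339
  # Accounting view, if slurmdbd/accounting is enabled
  sacct -j 339 --format=JobID,JobName,State,ExitCode,DerivedExitCode,NodeList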

Test Suite: test_module

Results

Duration0.806 sec
Tests14
Failures0

Tests

test_module

Test case:[perf-tools/Score-P] Verify scorep module is loaded and matches rpm version (gnu14/openmpi5)
Outcome:Passed
Duration:0.203 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_DIR is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.022 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify module SCOREP_BIN is defined and exists (gnu14/openmpi5)
Outcome:Passed
Duration:0.023 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-backend-info binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-config binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-info binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-preload-init binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-score binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep-wrapper binary (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of scorep wrapper binaries (gnu14/openmpi5)
Outcome:Passed
Duration:0.205 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of user guide for scorep (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of OPEN_ISSUES for scorep (gnu14/openmpi5)
Outcome:Passed
Duration:0.035 sec
FailedNone
None
Test case:[perf-tools/Score-P] Verify availability of COPYING for scorep (gnu14/openmpi5)
Outcome:Passed
Duration:0.036 sec
FailedNone
None

Test Suite: test_harness

Results

Duration3.44 sec
Tests3
Failures1

Tests

test_harness

Test case:[RMS/harness] Verify zero exit code from MPI job runs OK (slurm/gnu14/mpich)
Outcome:Failed
Duration:1.215 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_harness, line 23)
  `run_mpi_binary ./mpi_exit 0 $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.28062
Batch job 354 submitted

Job 354 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.354.out: No such file or directory
Test case:[RMS/harness] Verify non-zero exit code from MPI job detected as failure (slurm/gnu14/mpich)
Outcome:Passed
Duration:2.225 sec
FailedNone
None
Test case:[RMS/harness] Verify long-running MPI job terminates with timeout parameter (slurm/gnu14/mpich)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled timeout test
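
The harness failure above is the simplest reproducer in this report: an MPI binary that only exits 0 still dies with the same signal-53 reason, and its job.<id>.out file is missing. A small control experiment, sketched below with illustrative paths, would separate working-directory and output-file problems from MPI problems by submitting a trivial job whose working directory and output path are guaranteed to exist on the compute node:

  # Trivial batch job with an absolute, node-local working directory and output file
  sbatch -w c1 -N 1 --chdir=/tmp --output=/tmp/ohpc-test.%j.out --wrap='srun hostname'
  # After it completes, inspect the output on the node that ran it, e.g.
  #   clush -w c1 'cat /tmp/ohpc-test.<jobid>.out'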

Test Suite: test_harness

Results

Duration4.512 sec
Tests3
Failures1

Tests

test_harness

Test case:[RMS/harness] Verify zero exit code from MPI job runs OK (slurm/gnu14/openmpi5)
Outcome:Failed
Duration:1.219 sec
FailedNone
(from function `run_mpi_binary' in file ./common/functions, line 403,
 in test file test_harness, line 23)
  `run_mpi_binary ./mpi_exit 0 $NODES $TASKS' failed
job script = /tmp/job.ohpc-test.29818
Batch job 356 submitted

Job 356 failed...
Reason=RaisedSignal:53(Real-time_signal_19)

cat: job.356.out: No such file or directory
Test case:[RMS/harness] Verify non-zero exit code from MPI job detected as failure (slurm/gnu14/openmpi5)
Outcome:Passed
Duration:3.293 sec
FailedNone
None
Test case:[RMS/harness] Verify long-running MPI job terminates with timeout parameter (slurm/gnu14/openmpi5)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled timeout test

Test Suite: run

Results

Duration3.57 sec
Tests5
Failures1

Tests

run

Test case:[charliecloud] check for RPM
Outcome:Passed
Duration:0.306 sec
FailedNone
None
Test case:[charliecloud] build image
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
skip experimental charliecloud image builder
Test case:[charliecloud] build alpine image from Docker (using singularity)
Outcome:Passed
Duration:2.536 sec
FailedNone
None
Test case:[charliecloud] exec image locally
Outcome:Passed
Duration:0.308 sec
FailedNone
None
Test case:[charliecloud] exec image via slurm
Outcome:Failed
Duration:0.42 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file run, line 70)
  `run_serial_binary timeout 20s ch-run alpine -- cat /etc/os-release' failed with status 127
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/runtimes/charliecloud': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/runtimes/charliecloud': No such file or directory: going to /tmp instead
/bin/timeout: failed to run command 'ch-run': No such file or directory
srun: error: c1: task 0: Exited with exit code 127
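
This failure differs slightly from the working-directory pattern: after the fallback to /tmp, /bin/timeout cannot find ch-run at all (exit 127), which points at the charliecloud binary being absent or off the default PATH on the compute node even though the RPM check passed on the SMS. A quick hedged check, reusing clush and the node name from the error output:

  # Is ch-run present and on the default PATH on the compute node?
  clush -w c1 'command -v ch-run || echo "ch-run not found on $(hostname)"'
  # Compare against the SMS, where the RPM check above passed
  command -v ch-run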

Test Suite: run

Results

Duration8.999 sec
Tests4
Failures1

Tests

run

Test case:[singularity] check for RPM
Outcome:Passed
Duration:0.03 sec
FailedNone
None
Test case:[singularity] pull down ubuntu docker image
Outcome:Passed
Duration:8.529 sec
FailedNone
None
Test case:[singularity] exec image
Outcome:Passed
Duration:0.278 sec
FailedNone
None
Test case:[singularity] exec image via slurm
Outcome:Failed
Duration:0.162 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file run, line 45)
  `run_serial_binary singularity exec ${IMAGE} cat /etc/os-release' failed with status 255
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/runtimes/singularity': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/runtimes/singularity': No such file or directory: going to /tmp instead
FATAL:   While checking image: could not open image /tmp/ubuntu.sif: failed to retrieve path for /tmp/ubuntu.sif: lstat /tmp/ubuntu.sif: no such file or directory
srun: error: c1: task 0: Exited with exit code 255
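
The singularity failure follows the working-directory pattern: after the chdir fallback, ${IMAGE} resolves relative to /tmp and /tmp/ubuntu.sif does not exist on c1. Since the local exec above passed, re-running the slurm case with an absolute image path on storage the compute node can see should isolate the problem; the image path below is hypothetical:

  IMG=/path/to/ubuntu.sif   # hypothetical absolute path to the pulled image
  srun -w c1 --chdir=/tmp singularity exec "$IMG" cat /etc/os-release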

Test Suite: completion

Results

Duration0.093 sec
Tests1
Failures0

Tests

completion

Test case:[singularity] check for bash completion
Outcome:Passed
Duration:0.093 sec
FailedNone
None

Test Suite: ntp

Results

Duration0.313 sec
Tests3
Failures1

Tests

ntp

Test case:[ntp] check for chronyc binary
Outcome:Passed
Duration:0.034 sec
FailedNone
None
Test case:[ntp] verify local time in sync on SMS
Outcome:Passed
Duration:0.155 sec
FailedNone
None
Test case:[ntp] verify local time in sync on compute
Outcome:Failed
Duration:0.124 sec
FailedNone
(from function `run_serial_binary' in file ./common/functions, line 236,
 in test file ntp, line 29)
  `run_serial_binary ./check_time.sh' failed with status 2
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/time': No such file or directory: going to /tmp instead
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/time': No such file or directory: going to /tmp instead
slurmstepd-c1: error: execve(): /tmp/./check_time.sh: No such file or directory
srun: error: c1: task 0: Exited with exit code 2
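
As with the other serial failures, check_time.sh is never executed on the compute node: the step falls back to /tmp and the script is not there. Until the shared test tree is reachable from c1, one way to re-check compute-side time sync is to stage the script somewhere the node can see and run it from there; a sketch, assuming the same ssh access to c1 that the passing clush tests rely on:

  # Stage the script (path taken from the failure output) and run it via the resource manager
  scp /home/ohpc-test/tests/time/check_time.sh c1:/tmp/
  srun -w c1 --chdir=/tmp /tmp/check_time.sh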

Test Suite: mem_limits

Results

Duration0.348 sec
Tests2
Failures1

Tests

mem_limits

Test case:[memlock] check increased soft limit
Outcome:Failed
Duration:0.175 sec
FailedNone
(in test file mem_limits, line 28)
  `grep -v SOFT <"${OUTPUT}" | while IFS= read -r limit; do' failed
slurmstepd-c1: error: couldn't chdir to `/home/ohpc-test/tests/user-env': No such file or directory: going to /tmp instead
Test case:[memlock] check increased hard limit
Outcome:Passed
Duration:0.173 sec
FailedNone
None
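
The soft-limit check fails after the same chdir fallback, so it is unclear from the log whether the memlock soft limit itself is wrong or the test simply could not read its output. The limits can be queried directly on the compute node through the resource manager; a minimal sketch:

  # Soft and hard max-locked-memory limits as seen by a slurm task on c1
  srun -w c1 --chdir=/tmp bash -c 'echo "soft: $(ulimit -S -l)"; echo "hard: $(ulimit -H -l)"'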

Test Suite: pdsh

Results

Duration0.56 sec
Tests4
Failures0

Tests

pdsh

Test case:[pdsh] check for RPM
Outcome:Passed
Duration:0.052 sec
FailedNone
None
Test case:[pdsh] run a shell command on c[1-4]
Outcome:Passed
Duration:0.227 sec
FailedNone
None
Test case:[pdsh] check for pdsh-mod-slurm RPM
Outcome:Passed
Duration:0.051 sec
FailedNone
None
Test case:[pdsh] run a shell command on -P normal
Outcome:Passed
Duration:0.23 sec
FailedNone
None

Test Suite: ompi_info

Results

Duration0.645 sec
Tests1
Failures0

Tests

ompi_info

Test case:[openmpi] check for no output to stderr with ompi_info
Outcome:Passed
Duration:0.645 sec
FailedNone
None

Test Suite: magpie

Results

Duration0.92 sec
Tests3
Failures0

Tests

magpie

Test case:[magpie] check for RPM
Outcome:Passed
Duration:0.053 sec
FailedNone
None
Test case:[magpie] Verify MAGPIE module is loaded and matches rpm version
Outcome:Passed
Duration:0.518 sec
FailedNone
None
Test case:[magpie] Verify module MAGPIE_DIR is defined and exists
Outcome:Passed
Duration:0.349 sec
FailedNone
None

Test Suite: munge

Results

Duration1.145 sec
Tests4
Failures0

Tests

munge

Test case:[munge] check for OS-provided RPM
Outcome:Passed
Duration:0.054 sec
FailedNone
None
Test case:[munge] Generate a credential
Outcome:Passed
Duration:0.032 sec
FailedNone
None
Test case:[munge] Decode credential locally
Outcome:Passed
Duration:0.026 sec
FailedNone
None
Test case:[munge] Run benchmark
Outcome:Passed
Duration:1.033 sec
FailedNone
None

Test Suite: sacct

Results

Duration0.0 sec
Tests1
Failures0

Tests

sacct

Test case:[slurm] check for working sacct
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
Temporarily skip sacct check

Test Suite: sinfo

Results

Duration0.091 sec
Tests1
Failures0

Tests

sinfo

Test case:[slurm] Verify SLURM RPM version matches sinfo binary
Outcome:Passed
Duration:0.091 sec
FailedNone
None

Test Suite: oom

Results

Duration0.0 sec
Tests1
Failures0

Tests

oom

Test case:[oom] Test job OOM condition (gnu14/slurm)
Outcome:Skipped
Duration:0.0 sec
FailedNone
SkippedNone
None
disabled to investigate disk fillup