avocado-framework-tests / avocado-misc-tests
Community maintained Avocado tests repository
License: Other
I executed tests on an embedded Linux device (BeagleBone Black) using an Ubuntu 16.04.3 demo image. How should I provide the results to you? (I am not sure whether it is appropriate to paste them here, as people shouldn't, but could, use them in production.)
The softwareraid test with the two disks mentioned in the YAML does not exit cleanly: we see that /dev/dm-6 is still part of the software RAID.
avocado run softwareraid.py -m softwareraid.py.data/softwareraid.yaml
JOB ID : eaf339b5e0084b99a2162905bcc25f833690e5a5
JOB LOG : /root/avocado/job-results/job-2017-04-24T06.05-eaf339b/job.log
(1/6) softwareraid.py:SoftwareRaid.test_run;raidlinear-scenario-f4df: PASS (0.40 s)
(2/6) softwareraid.py:SoftwareRaid.test_run;raid0-scenario-b22b: PASS (0.27 s)
(3/6) softwareraid.py:SoftwareRaid.test_run;raid1-scenario-308f: FAIL (0.11 s)
(4/6) softwareraid.py:SoftwareRaid.test_run;raid5-scenario-99ad: FAIL (0.10 s)
(5/6) softwareraid.py:SoftwareRaid.test_run;raid10-scenario-9424: FAIL (0.10 s)
(6/6) softwareraid.py:SoftwareRaid.test_run;raid6-scenario-93a1: FAIL (0.10 s)
RESULTS : PASS 2 | ERROR 0 | FAIL 4 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 0
TESTS TIME : 1.09 s
/dev/md126:
Version : 1.2
Raid Level : raid0
Total Devices : 1
Persistence : Superblock is persistent
State : inactive
Name : linux-0a5l:mdsraid (local to host linux-0a5l)
UUID : 769cfcd5:d6fe6f19:60bd6c33:336befb9
Events : 0
Number Major Minor RaidDevice
- 254 6 - /dev/dm-6
scenario:
    disk: /dev/dm-5 /dev/dm-6
    raidlevel: !mux
        raidlinear:
            raid: linear
        raid0:
            raid: 0
        raid1:
            raid: 1
        raid5:
            raid: 5
        raid10:
            raid: 10
        raid6:
            raid: 6
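A minimal sketch of the cleanup the test's tearDown could perform so that no leftover array (e.g. /dev/md126 still holding /dev/dm-6) survives the run. The device names below are taken from the report for illustration; the actual teardown flow of softwareraid.py may differ.

```python
# Build the mdadm commands that stop a leftover array and wipe the
# member superblocks, so the disks leave the software RAID cleanly.
def cleanup_commands(md_device, member_disks):
    cmds = ["mdadm --stop %s" % md_device]
    for disk in member_disks:
        cmds.append("mdadm --zero-superblock %s" % disk)
    return cmds

for cmd in cleanup_commands("/dev/md126", ["/dev/dm-5", "/dev/dm-6"]):
    print(cmd)
```

Running the returned commands (as root) after each raid-level scenario should leave the disks ready for the next one.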
Is it possible to define the configuration of all or several tests in one central place instead of in the test-specific configuration files? E.g., the test /io/pci/pci_hotplug.py is configured in /io/pci/pci_hotplug.py.data.
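One possible shape for such a central configuration, sketched as a plain dict keyed by test name (in practice it could be loaded from a single central YAML file); every test name and parameter here is made up for illustration:

```python
# Hypothetical central parameter store, one section per test.
CENTRAL_CONFIG = {
    "pci_hotplug": {"pci_device": "0001:01:00.0", "iterations": 2},
    "softwareraid": {"disk": "/dev/dm-5 /dev/dm-6"},
}

def params_for(test_name, default=None):
    """Return the parameter section for one test, or a default."""
    return CENTRAL_CONFIG.get(test_name, default or {})
```

Each test would then look up its own section instead of shipping a private `.data` directory.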
Some tests, like trinity (see its README), can be harmful to the environment of the system under test (not only the system under test itself). It could be a good idea to inform potential users of this project about that right on the landing page (README).
The tests for PCI "enhanced error handling" (EEH) are specific to the powerpc architecture:
$ avocado list ./io/pci
INSTRUMENTED ./io/pci/PowerNVEEH.py:PowerNVEEH.test_eeh_basic_pe
INSTRUMENTED ./io/pci/PowerVMEEH.py:PowerVMEEH.test_eeh_basic_pe
...
However, they are not tagged with power or some other suitable tag (power seems to be used for another class of tests):
$ avocado list ./io/pci --filter-by-tags=power
$
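Avocado reads tags from a `:avocado: tags=...` line in the class docstring. A sketch of what tagging the EEH tests could look like, plus a simplified stand-in for the extraction avocado performs; the `ppc` tag name is an assumption, since `power` is apparently already taken by another class of tests:

```python
import re

class PowerNVEEHExample:
    """
    EEH basic PE test (illustrative copy, not the real class).

    :avocado: tags=ppc,privileged
    """

def docstring_tags(docstring):
    """Mimic (simplified) avocado's tag extraction from a docstring."""
    match = re.search(r":avocado:\s+tags=(\S+)", docstring or "")
    return set(match.group(1).split(",")) if match else set()
```

With such a tag in place, `avocado run ./io/pci --filter-by-tags=ppc` would select only the powerpc-specific tests.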
They fail on other architectures (observed on ARMv7):
avocado run ./io/pci/PowerNVEEH.py:PowerNVEEH.test_eeh_basic_pe
JOB ID : f42b620a9dee655e5d2adef9812052e29ae8fce7
JOB LOG : /home/rc/avocado/job-results/job-2017-11-29T14.37-f42b620/job.log
(1/1) ./io/pci/PowerNVEEH.py:PowerNVEEH.test_eeh_basic_pe: ERROR (0.00 s)
RESULTS : PASS 0 | ERROR 1 | FAIL 0 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 0
JOB TIME : 0.43 s
Excerpt from the log file /home/rc/avocado/job-results/job-2017-11-29T14.37-f42b620/job.log:
2017-11-29 14:37:19,157 test L0965 ERROR| ERROR 1-./io/pci/PowerNVEEH.py:PowerNVEEH.test_eeh_basic_pe -> TestSetupFail: [Errno 13] Permission denied: '/sys/kernel/debug/powerpc/eeh_enable'
[stdout] Setting cpu: 68
[stdout] Setting cpu: 69
[stdout] Setting cpu: 70
[stdout] Setting cpu: 71
[stdout] Setting cpu: 72
[stdout] Setting cpu: 73
[stdout] Setting cpu: 74
[stdout] Setting cpu: 75
[stdout] Setting cpu: 76
[stdout] Setting cpu: 77
[stdout] Setting cpu: 78
[stdout] Setting cpu: 79
Running 'cpupower -c 0 frequency-info -f'
[stdout] analyzing CPU 0:
Command 'cpupower -c 0 frequency-info -f' finished with 0 after 0.00141215324402s
[stdout] current CPU frequency: 2593000 (asserted by call to kernel)
[stdout]
The userspace governor is working as expected
++++++++++++++++
++++++++++++++++
Running 'cpupower frequency-set -f '
[stderr] frequency-set: option requires an argument -- 'f'
[stdout] invalid or unknown argument
Command 'cpupower frequency-set -f ' finished with 234 after 0.00138282775879s
Reproduced traceback from: /usr/lib/python2.7/site-packages/avocado_framework-49.0-py2.7.egg/avocado/core/test.py:596
Traceback (most recent call last):
File "/root/avocado-misc-tests/cpu/cpupower.py", line 59, in test
self.check_governor(governor, min, max, cur)
File "/root/avocado-misc-tests/cpu/cpupower.py", line 103, in check_governor
self.check_userspace_governor(governor)
File "/root/avocado-misc-tests/cpu/cpupower.py", line 184, in check_userspace_governor
self.set_freq_val(self.get_random_freq())
File "/root/avocado-misc-tests/cpu/cpupower.py", line 143, in set_freq_val
output = process.run(cmd)
File "/usr/lib/python2.7/site-packages/avocado_framework-49.0-py2.7.egg/avocado/utils/process.py", line 1117, in run
raise CmdError(cmd, sp.result)
CmdError: Command 'cpupower frequency-set -f ' failed (rc=234)
Local variables:
-> cur <type 'str'>: 2394000
-> min <type 'str'>: 2061000
-> governors <type 'list'>: ['conservative', 'userspace', 'powersave', 'ondemand', 'performance']
-> max <type 'str'>: 3690000
-> self <class 'cpupower.cpupower'>: 1-cpu/cpupower.py:cpupower.test
-> governor <type 'str'>: userspace
-> initial_governor <type 'str'>: userspace
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/avocado_framework-49.0-py2.7.egg/avocado/core/test.py", line 667, in _run_avocado
raise test_exception
CmdError: Command 'cpupower frequency-set -f ' failed (rc=234)
ERROR 1-cpu/cpupower.py:cpupower.test -> CmdError: Command 'cpupower frequency-set -f ' failed (rc=234)
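The traceback shows `cpupower frequency-set -f ` being run with an empty frequency argument, so the underlying bug is likely that `get_random_freq()` returned nothing on this machine. A defensive sketch of the command construction in `set_freq_val` (function name taken from the traceback; the guard itself is my suggestion):

```python
# Return the cpupower command, or None when no frequency is known,
# so the caller can CANCEL instead of running a malformed command.
def build_freq_cmd(freq):
    if not freq:
        return None
    return "cpupower frequency-set -f %s" % freq
```

A `None` return would turn the confusing rc=234 failure into an explicit cancellation with a clear message.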
The test case mandates supplying a disk, and it fails if the machine does not have an extra disk.
If a disk is not given, we could try to continue the test in a default directory. As long as the test does not fill the root file system that is fine; otherwise we have to limit the test run to the minimum disk space available on the disk.
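The safeguard suggested above could be sketched with `os.statvfs`: when running in a default directory, cap the test's file size to a fraction of the free space so the root filesystem is never filled. The 50% cap is an arbitrary example value.

```python
import os

def capped_size(directory, requested_bytes, fraction=0.5):
    """Return requested_bytes, reduced so it never exceeds `fraction`
    of the free space in `directory`."""
    stat = os.statvfs(directory)
    free = stat.f_bavail * stat.f_frsize
    return min(requested_bytes, int(free * fraction))
```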
Thanks
The test suite generic/lshw.py fails with ERROR and TestSetupFail: Fail to install lshw required for this test, because the package lshw is not installed during setup. After manual installation with sudo apt-get install lshw on Ubuntu, the tests run without any problems. (Lshwrun.test_lshw_verification and Lshwrun.test_lshw_options require explicit configuration.)
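The repository's tests usually install dependencies via avocado's SoftwareManager in setUp; as a plain-Python stand-in, a pre-check like the following could make the test CANCEL early with a clear message when the required tool is absent, instead of erroring out:

```python
import shutil

def tool_available(name):
    """True when `name` is found on PATH (e.g. 'lshw')."""
    return shutil.which(name) is not None
```

In an avocado setUp this would translate to something like `if not tool_available("lshw"): self.cancel("lshw is required")` (method name per avocado's Test API).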
The test CANCELs due to a non-existing PCI device.
$ avocado run avocado-misc-tests/io/driver/driver_bind_test.py:DriverBindTest.test --tap -
1..1
# debug.log of avocado-misc-tests/io/driver/driver_bind_test.py:DriverBindTest.test:
ok 1 avocado-misc-tests/io/driver/driver_bind_test.py:DriverBindTest.test # CANCEL 0001:01:00.0 does not exist
It seems the PCI device used in the test leads to this CANCEL (see the source code of the corresponding test setup). Would it be reasonable to make the default PCI device configurable?
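A sketch of the suggested configurability: read the PCI address from the test parameters, with the currently hard-coded value only as a default, and signal a CANCEL when the device is absent on this system. The parameter name `pci_device` is an assumption.

```python
import os

def resolve_pci_device(params, default="0001:01:00.0"):
    """Return the configured PCI address, or None when it does not
    exist on this system (the caller should then CANCEL)."""
    device = params.get("pci_device", default)
    if not os.path.isdir("/sys/bus/pci/devices/%s" % device):
        return None
    return device
```

Users could then point the test at a device that actually exists via the usual yaml parameter files.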
Travis is failing due to a PEP8 issue:
[root@localhost avocado-misc-tests]# inspekt style
PEP8 disabled: E501,E265,W601,E402
/root/avocado-misc-tests/cpu/ebizzy.py:95:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/cpu/ebizzy.py
/root/avocado-misc-tests/cpu/pmqa.py:70:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/cpu/pmqa.py
/root/avocado-misc-tests/cpu/sensors.py:115:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/cpu/sensors.py
/root/avocado-misc-tests/fs/filebench.py:83:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/fs/filebench.py
/root/avocado-misc-tests/fs/xfstests.py:187:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/fs/xfstests.py
/root/avocado-misc-tests/fuzz/fsfuzzer.py:89:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/fuzz/fsfuzzer.py
/root/avocado-misc-tests/fuzz/trinity.py:99:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/fuzz/trinity.py
/root/avocado-misc-tests/generic/criu.py:61:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/criu.py
/root/avocado-misc-tests/generic/gdb.py:63:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/gdb.py
/root/avocado-misc-tests/generic/interbench.py:72:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/interbench.py
/root/avocado-misc-tests/generic/ltp.py:88:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/ltp.py
/root/avocado-misc-tests/generic/oprofile.py:58:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/oprofile.py
/root/avocado-misc-tests/generic/rcutorture.py:135:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/rcutorture.py
/root/avocado-misc-tests/generic/service_check.py:74:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/generic/service_check.py
/root/avocado-misc-tests/io/disk/dbench.py:88:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/disk/dbench.py
/root/avocado-misc-tests/io/disk/fs_mark.py:66:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/disk/fs_mark.py
/root/avocado-misc-tests/io/disk/lvsetup.py:113:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/disk/lvsetup.py
/root/avocado-misc-tests/io/disk/softwareraid.py:137:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/disk/softwareraid.py
/root/avocado-misc-tests/io/disk/tiobench.py:78:1: E305 expected 2 blank lines after class or function definition, found 0
Style check fail: /root/avocado-misc-tests/io/disk/tiobench.py
/root/avocado-misc-tests/io/disk/ssd/blkdiscard.py:77:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/disk/ssd/blkdiscard.py
/root/avocado-misc-tests/io/disk/ssd/nvmetest.py:178:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/disk/ssd/nvmetest.py
/root/avocado-misc-tests/io/net/net_tools.py:349:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/net/net_tools.py
/root/avocado-misc-tests/io/net/pktgen.py:110:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/io/net/pktgen.py
/root/avocado-misc-tests/kernel/posixtest.py:67:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/kernel/posixtest.py
/root/avocado-misc-tests/kernel/rmaptest.py:85:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/kernel/rmaptest.py
/root/avocado-misc-tests/memory/libhugetlbfs.py:189:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/memory/libhugetlbfs.py
/root/avocado-misc-tests/memory/stutter.py:75:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/memory/stutter.py
/root/avocado-misc-tests/perf/hackbench.py:82:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/perf/hackbench.py
/root/avocado-misc-tests/perf/lmbench.py:109:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/perf/lmbench.py
/root/avocado-misc-tests/perf/perftool.py:64:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/perf/perftool.py
/root/avocado-misc-tests/perf/rt_tests.py:66:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/perf/rt_tests.py
/root/avocado-misc-tests/perf/stress.py:107:1: E305 expected 2 blank lines after class or function definition, found 1
Style check fail: /root/avocado-misc-tests/perf/stress.py
/root/avocado-misc-tests/perf/unixbench.py:64:9: E741 ambiguous variable name 'l'
/root/avocado-misc-tests/perf/unixbench.py:65:12: E741 ambiguous variable name 'l'
Style check fail: /root/avocado-misc-tests/perf/unixbench.py
PEP8 compliance FAIL
[root@localhost avocado-misc-tests]
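For reference, E305 complains when module-level code follows a class or function definition without two blank lines in between. A minimal illustration of the compliant layout (class name is made up):

```python
class Ebizzy:
    def run(self):
        return "ok"


# The two blank lines above the comment satisfy E305.
instance = Ebizzy()
```

The E741 findings in perf/unixbench.py additionally require renaming the variable `l` to something unambiguous (e.g. `line`).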
I am trying to run a test case with the -m option, and avocado throws an error: avocado run: error: unrecognized arguments: -m PowerVMEEH.py.data/PowerVMEEH.yaml
usage: avocado run [-h] [-d] [--force-job-id UNIQUE_JOB_ID]
[--job-results-dir DIRECTORY] [--job-timeout SECONDS]
[--failfast {on,off}] [--keep-tmp {on,off}]
[--ignore-missing-references {on,off}] [--sysinfo {on,off}]
[--execution-order {tests-per-variant,variants-per-test}]
[-s] [--show-job-log]
[--store-logging-stream [STREAM[:LEVEL] [STREAM[:LEVEL]
...]]] [--output-check-record {none,all,stdout,stderr}]
[--output-check {on,off}]
[--loaders [LOADERS [LOADERS ...]]]
[--external-runner EXECUTABLE]
[--external-runner-chdir {runner,test}]
[--external-runner-testdir DIRECTORY]
[--filter-by-tags TAGS] [--filter-by-tags-include-empty]
[--env-keep ENV_KEEP]
[--gdb-run-bin EXECUTABLE[:BREAKPOINT]]
[--gdb-prerun-commands EXECUTABLE:COMMANDS]
[--gdb-coredump {on,off}] [--journal] [--json FILE]
[--json-job-result {on,off}] [--replay REPLAY_JOBID]
[--replay-test-status REPLAY_TESTSTATUS]
[--replay-ignore REPLAY_IGNORE] [--replay-resume]
[--tap FILE] [--tap-job-result {on,off}]
[--wrapper SCRIPT[:EXECUTABLE]] [--xunit FILE]
[--xunit-job-result {on,off}] [-z]
[TEST_REFERENCE [TEST_REFERENCE ...]]
positional arguments:
TEST_REFERENCE List of test references (aliases or paths)
optional arguments:
-h, --help show this help message and exit
-d, --dry-run Instead of running the test only list them and log
their params.
--force-job-id UNIQUE_JOB_ID
Forces the use of a particular job ID. Used internally
when interacting with an avocado server. You should
not use this option unless you know exactly what
you're doing
--job-results-dir DIRECTORY
Forces to use of an alternate job results directory.
--job-timeout SECONDS
Set the maximum amount of time (in SECONDS) that tests
are allowed to execute. Values <= zero means "no
timeout". You can also use suffixes, like: s
(seconds), m (minutes), h (hours).
--failfast {on,off} Enable or disable the job interruption on first failed
test.
--keep-tmp {on,off} Keep job temporary files (useful for avocado
debugging). Defaults to off.
--ignore-missing-references {on,off}
Force the job execution, even if some of the test
references are not resolved to tests.
--sysinfo {on,off} Enable or disable system information (hardware
details, profilers, etc.). Current: on
--execution-order {tests-per-variant,variants-per-test}
Defines the order of iterating through test suite and
test variants
output and result format:
-s, --silent Silence stdout
--show-job-log Display only the job log on stdout. Useful for test
debugging purposes. No output will be displayed if you
also specify --silent
--store-logging-stream [STREAM[:LEVEL] [STREAM[:LEVEL] ...]]
Store given logging STREAMs in
$JOB_RESULTS_DIR/$STREAM.$LEVEL.
--journal Records test status changes (for use with avocado-
journal-replay and avocado-server)
--json FILE Enable JSON result format and write it to FILE. Use
'-' to redirect to the standard output.
--json-job-result {on,off}
Enables default JSON result in the job results
directory. File will be named "results.json".
--tap FILE Enable TAP result output and write it to FILE. Use '-'
to redirect to the standard output.
--tap-job-result {on,off}
Enables default TAP result in the job results
directory. File will be named "results.tap".
--xunit FILE Enable xUnit result format and write it to FILE. Use
'-' to redirect to the standard output.
--xunit-job-result {on,off}
Enables default xUnit result in the job results
directory. File will be named "results.xml".
-z, --archive Archive (ZIP) files generated by tests
output check arguments:
--output-check-record {none,all,stdout,stderr}
Record output streams of your tests to reference files
(valid options: none (do not record output streams),
all (record both stdout and stderr), stdout (record
only stderr), stderr (record only stderr). Current:
none
--output-check {on,off}
Enable or disable test output (stdout/stderr) check.
If this option is off, no output will be checked, even
if there are reference files present for the test.
Current: on (output check enabled)
loader options:
--loaders [LOADERS [LOADERS ...]]
Overrides the priority of the test loaders. You can
specify either @loader_name or TEST_TYPE. By default
it tries all available loaders according to priority
set in settings->plugins.loaders.
--external-runner EXECUTABLE
Path to an specific test runner that allows the use of
its own tests. This should be used for running tests
that do not conform to Avocado' SIMPLE testinterface
and can not run standalone. Note: the use of
--external-runner overwrites the --loaders to
"external_runner"
--external-runner-chdir {runner,test}
Change directory before executing tests. This option
may be necessary because of requirements and/or
limitations of the external test runner. If the
external runner requires to be run from its own base
directory,use "runner" here. If the external runner
runs tests based on files and requires to be run from
the directory where those files are located, use
"test" here and specify the test directory with the
option "--external-runner-testdir". Defaults to "None"
--external-runner-testdir DIRECTORY
Where test files understood by the external test
runner are located in the filesystem. Obviously this
assumes and only applies to external test runners that
run tests from files
filtering parameters:
--filter-by-tags TAGS
Filter INSTRUMENTED tests based on ":avocado:
tags=tag1,tag2" notation in their class docstring
--filter-by-tags-include-empty
Include all tests without tags during filtering. This
effectively means they will be kept in the test suite
found previously to filtering.
keep environment variables:
--env-keep ENV_KEEP Keep environment variables in remote executions
GNU Debugger support:
--gdb-run-bin EXECUTABLE[:BREAKPOINT]
Run a given executable inside the GNU debugger,
pausing at a given breakpoint (defaults to "main")
--gdb-prerun-commands EXECUTABLE:COMMANDS
After loading an executable in GDB, but before
actually running it, execute the GDB commands in the
given file. EXECUTABLE is optional, if omitted
COMMANDS will apply to all executables
--gdb-coredump {on,off}
Automatically generate a core dump when the inferior
process received a fatal signal such as SIGSEGV or
SIGABRT
job replay:
--replay REPLAY_JOBID
Replay a job identified by its (partial) hash id. Use
"--replay latest" to replay the latest job.
--replay-test-status REPLAY_TESTSTATUS
Filter tests to replay by test status
--replay-ignore REPLAY_IGNORE
Ignore variants (variants) and/or configuration
(config) from the source job
--replay-resume Resume an interrupted job
wrapper support:
--wrapper SCRIPT[:EXECUTABLE]
Use a script to wrap executables run by a test. The
wrapper is either a path to a script (AKA a global
wrapper) or a path to a script followed by colon
symbol (:), plus a shell like glob to the target
EXECUTABLE. Multiple wrapper options are allowed, but
only one global wrapper can be defined.
avocado run: error: unrecognized arguments: -m PowerVMEEH.py.data/PowerVMEEH.yaml
[root@ltcalpine-lp3 pci]#
Version: Avocado 50.0
The test generic/sysbench.py:Sysbench.test does not install sysbench during setup. A manual installation (on Ubuntu with sudo apt-get install sysbench) is required.
Should we limit the test to run for a given time or file size?
The test ./io/disk/smartctl.py:SmartctlTest.test is not suitable for systems without a disk that supports SMART (many ATA/SATA and SCSI/SAS hard drives and solid-state drives do). A CANCEL would be suitable.
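The suggested CANCEL could be driven by parsing `smartctl -i <disk>` output before the test body runs. A sketch; the sample line format follows typical smartctl output, but treat it as an assumption:

```python
def smart_available(smartctl_info_output):
    """True when `smartctl -i` reports SMART support on the device."""
    for line in smartctl_info_output.splitlines():
        if line.startswith("SMART support is:") and "Available" in line:
            return True
    return False
```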
I installed Avocado (v55.0) from PyPI and added the additional dependencies with sudo pip install -r https://raw.githubusercontent.com/avocado-framework/avocado/master/requirements.txt as well. If I run the IO net test suite with avocado run ~/avocado-misc-tests-master/io/net, an import error is raised during the execution of the test HtxNicTest.test.
The relevant log file content:
2017-11-09 11:28:28,958 test L0965 ERROR| ERROR 04-/home/rc/avocado-misc-tests-master/io/net/htx_nic_devices.py:HtxNicTest.test -> TestError: Traceback (most recent call last):
File "/home/rc/avocado-misc-tests-master/io/net/htx_nic_devices.py", line 23, in <module>
from pexpect import pxssh
ImportError: No module named pexpect
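pexpect is not part of avocado's requirements.txt, so the test could guard the optional dependency and CANCEL with a clear message instead of dying with an ImportError. A sketch of the availability check:

```python
import importlib.util

def module_available(name):
    """True when the module `name` can be imported."""
    return importlib.util.find_spec(name) is not None
```

In setUp this would become something like `if not module_available("pexpect"): self.cancel("pexpect is required")`, and `from pexpect import pxssh` could then be done lazily inside the test.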
The arcconf_cntl_oper script skips the FW flash test case due to improper checking.
2017-01-30 04:06:31,040 download L0062 INFO | Fetching http://9.47.67.137/repo/firmware/RAID/arcconf/tool/Arcconf-2.02-22404.ppc64el.deb -> /usr/share/avocado/data/cache/Arcconf-2.02-22404.ppc64el.deb.NuO94t
2017-01-30 04:06:31,104 process L0368 INFO | Running 'dpkg -i /usr/share/avocado/data/cache/Arcconf-2.02-22404.ppc64el.deb'
2017-01-30 04:06:31,235 process L0458 DEBUG| [stdout] Selecting previously unselected package arcconf.
2017-01-30 04:06:31,260 process L0458 DEBUG| [stdout] (Reading database ... 83162 files and directories currently installed.)
2017-01-30 04:06:31,262 process L0458 DEBUG| [stdout] Preparing to unpack .../Arcconf-2.02-22404.ppc64el.deb ...
2017-01-30 04:06:31,310 process L0458 DEBUG| [stdout] Unpacking arcconf (2.02-22404) ...
2017-01-30 04:06:31,638 process L0458 DEBUG| [stdout] Setting up arcconf (2.02-22404) ...
2017-01-30 04:06:31,731 process L0458 DEBUG| [stdout]
2017-01-30 04:06:31,734 process L0458 DEBUG| [stdout] Arcconf is located at /opt/pmcs/cli
2017-01-30 04:06:31,873 process L0478 INFO | Command 'dpkg -i /usr/share/avocado/data/cache/Arcconf-2.02-22404.ppc64el.deb' finished with 0 after 0.767499923706s
2017-01-30 04:06:31,874 stacktrace L0038 ERROR|
2017-01-30 04:06:31,874 stacktrace L0041 ERROR| Reproduced traceback from: /usr/local/lib/python2.7/dist-packages/avocado_framework-45.0-py2.7.egg/avocado/core/test.py:470
2017-01-30 04:06:31,874 stacktrace L0044 ERROR| Traceback (most recent call last):
2017-01-30 04:06:31,875 stacktrace L0044 ERROR| File "/var/lib/libvirt/images/workspace/PMC_Sierra/avocado-misc-tests/io/disk/arcconf/arcconf_cntl_oper.py", line 87, in setUp
2017-01-30 04:06:31,875 stacktrace L0044 ERROR| self.skip("Unable to install arcconf")
2017-01-30 04:06:31,875 stacktrace L0044 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado_framework-45.0-py2.7.egg/avocado/core/test.py", line 672, in skip
2017-01-30 04:06:31,875 stacktrace L0044 ERROR| raise exceptions.TestSkipError(message)
2017-01-30 04:06:31,875 stacktrace L0044 ERROR| TestSkipError: Unable to install arcconf
When executing DlparPci.test on systems other than POWER4-based ones (observed on ARMv7), the test fails:
$ avocado run ./io/pci/dlpar.py:DlparPci.test
JOB ID : e931b482c930a16041ee940377fba9558eac3965
JOB LOG : /home/rc/avocado/job-results/job-2017-11-29T15.03-e931b48/job.log
(1/1) ./io/pci/dlpar.py:DlparPci.test: ERROR (0.29 s)
RESULTS : PASS 0 | ERROR 1 | FAIL 0 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 0
JOB TIME : 0.73 s
Excerpt from the log file:
2017-11-29 15:03:35,253 test L0965 ERROR| ERROR 1-./io/pci/dlpar.py:DlparPci.test -> TestSetupFail: 'NoneType' object is not iterable
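TestSetupFail with "'NoneType' object is not iterable" suggests setUp iterates over a lookup that returns None on non-POWER hardware. A sketch of guarding it so the test CANCELs instead of erroring; the `ppc` machine-prefix check is an illustrative heuristic, not the test's actual logic:

```python
import platform

def slots_or_none(lookup_result):
    """Return the slot list, or None when the platform cannot provide
    one (the caller should then CANCEL)."""
    if lookup_result is None or not platform.machine().startswith("ppc"):
        return None
    return list(lookup_result)
```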
git_repo() in avocado/utils/git.py clones the files directly into the upstream directory; instead, it should clone the directory that contains the files.
It crashes as shown below.
alp3:~ # avocado --help
Avocado crashed unexpectedly: 'module' object has no attribute 'NullHandler'
You can find details in /tmp/avocado-traceback-2017-04-05_02:32:43-i8q2Rb.log
(Full log: Avocado_log.txt)
I followed these steps:
Installed the git, gcc, python-devel, python-pip, libvirt-devel, libyaml-devel and xz-devel packages.
git clone git://github.com/avocado-framework/avocado.git
cd avocado
sudo make requirements
sudo python setup.py install
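`logging.NullHandler` only exists since Python 2.7, so this crash points at an older interpreter. If upgrading Python is not an option, the classic compatibility shim looks like this (a workaround sketch, not avocado's own code):

```python
import logging

try:
    NullHandler = logging.NullHandler
except AttributeError:  # Python < 2.7 lacks logging.NullHandler
    class NullHandler(logging.Handler):
        def emit(self, record):
            pass

# A library can then safely silence its logger on any interpreter.
logging.getLogger("avocado").addHandler(NullHandler())
```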
The test generic/hwinfo.py:HWInfo.test depends on hwinfo, which is not available for Ubuntu 14.04 out of the box. When executed, the test status is ERROR.
I see some tests and utilities in avocado-vt/virttest. If I want to use them and write tests in avocado-misc-tests, what is the recommended way? Can I reference them directly?
Thanks
Narasimhan V
When executing the test suite io/disk with avocado run ./io/disk and the default configuration (no modification of YAML files, etc.), the tests for the Avago storage adapter FAIL (io_disk.log being the log file of the test-suite run):
$ cat io_disk.log | sed -n -e 's/^.*| FAIL //p'
...
31-./io/disk/Avago_storage_adapter/avago9361.py:Avago9361.test_display -> TestFail: Failed to display the version of the tool
32-./io/disk/Avago_storage_adapter/avago9361.py:Avago9361.test_adjustablerates -> TestFail: Failed to show the rate
33-./io/disk/Avago_storage_adapter/avago9361.py:Avago9361.test_set_on_off -> TestFail: Failed to show the deatils of {0}restorehotspare
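Incidentally, the literal `{0}` in "Failed to show the deatils of {0}restorehotspare" (typo included) indicates a `str.format` placeholder that was never expanded; the message was probably built without calling `.format()`. Illustrative fix:

```python
# The placeholder only expands once .format() is actually called.
template = "Failed to show the details of {0}"
message = template.format("restorehotspare")
```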
If I run the IO driver test suite with avocado run ~/avocado-misc-tests-master/io/driver (I downloaded and unzipped the test repository into ~/avocado-misc-tests-master), the execution of test 3 leads to an error:
2017-11-09 10:46:57,002 test L0965 ERROR| ERROR 3-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt -> OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt)
It seems the test runner tries to execute /home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt, which is listed as a test but actually isn't:
avocado list ~/avocado-misc-tests-master/io/driver
INSTRUMENTED /home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py:DriverBindTest.test
SIMPLE /home/rc/avocado-misc-tests-master/io/driver/module_unload_load.sh
SIMPLE /home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt
SIMPLE /home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml
The relevant log file content:
2017-11-09 10:46:55,777 test L0381 INFO | START 3-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt
2017-11-09 10:46:55,795 test L0402 DEBUG| Test metadata:
2017-11-09 10:46:55,813 test L0403 DEBUG| filename: /home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt
2017-11-09 10:46:55,908 process L0389 INFO | Running '/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt'
2017-11-09 10:46:56,045 stacktrace L0041 ERROR|
2017-11-09 10:46:56,049 stacktrace L0044 ERROR| Reproduced traceback from: /usr/local/lib/python2.7/dist-packages/avocado/core/test.py:817
2017-11-09 10:46:56,081 stacktrace L0047 ERROR| Traceback (most recent call last):
2017-11-09 10:46:56,087 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 1113, in test
2017-11-09 10:46:56,089 stacktrace L0047 ERROR| self._execute_cmd()
2017-11-09 10:46:56,092 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 1097, in _execute_cmd
2017-11-09 10:46:56,094 stacktrace L0047 ERROR| env=test_params)
2017-11-09 10:46:56,112 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 1114, in run
2017-11-09 10:46:56,115 stacktrace L0047 ERROR| cmd_result = sp.run(timeout=timeout)
2017-11-09 10:46:56,127 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 638, in run
2017-11-09 10:46:56,130 stacktrace L0047 ERROR| self._init_subprocess()
2017-11-09 10:46:56,132 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 402, in _init_subprocess
2017-11-09 10:46:56,135 stacktrace L0047 ERROR| raise details
2017-11-09 10:46:56,137 stacktrace L0047 ERROR| OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt)
2017-11-09 10:46:56,149 stacktrace L0048 ERROR|
2017-11-09 10:46:56,152 test L0822 DEBUG| Local variables:
2017-11-09 10:46:56,959 test L0825 DEBUG| -> self <class 'avocado.core.test.SimpleTest'>: 3-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt
2017-11-09 10:46:56,965 test L0278 DEBUG| DATA (filename=stdout.expected) => NOT FOUND (data sources: variant, file)
2017-11-09 10:46:56,982 test L0278 DEBUG| DATA (filename=stderr.expected) => NOT FOUND (data sources: variant, file)
2017-11-09 10:46:56,986 test L0950 ERROR| Traceback (most recent call last):
2017-11-09 10:46:56,988 test L0950 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 888, in _run_avocado
raise test_exception
2017-11-09 10:46:56,990 test L0950 ERROR| OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt)
2017-11-09 10:46:57,002 test L0965 ERROR| ERROR 3-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt -> OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/README.txt)
2017-11-09 10:46:57,004 test L0954 INFO |
2017-11-09 10:46:57,208 sysinfo L0415 INFO | Commands configured by file: /etc/avocado/sysinfo/commands
2017-11-09 10:46:57,231 sysinfo L0426 INFO | Files configured by file: /etc/avocado/sysinfo/files
2017-11-09 10:46:57,237 sysinfo L0446 INFO | Profilers configured by file: /etc/avocado/sysinfo/profilers
2017-11-09 10:46:57,239 sysinfo L0454 INFO | Profiler disabled
2017-11-09 10:46:57,461 sysinfo L0266 DEBUG| Journalctl collection failed: [Errno 2] No such file or directory (journalctl --quiet --lines 1 --output json)
2017-11-09 10:46:57,474 varianter L0116 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
2017-11-09 10:46:57,487 test L0381 INFO | START 4-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml
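Avocado's SIMPLE loader picks up any executable file, so README.txt (and the yaml file) is most likely listed as a test because its execute bit is set in the unzipped archive; clearing it (`chmod -x`) should make the loader skip the file. A demonstration of that criterion (simplified; the real loader logic is more involved):

```python
import os
import stat
import tempfile

def looks_like_simple_test(path):
    """Simplified stand-in for the loader's check: a plain,
    executable file is treated as a SIMPLE test."""
    return os.path.isfile(path) and os.access(path, os.X_OK)

# A fresh temp file (mode 0600) is not "a test" until chmod +x.
with tempfile.NamedTemporaryFile(delete=False) as handle:
    path = handle.name
before = looks_like_simple_test(path)
os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
after = looks_like_simple_test(path)
os.unlink(path)
```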
The test generic/nstress.py:NStress.test fails when run on Ubuntu + ARMv7. Is this test suitable for non-AIX systems? (IBM developerWorks: nstress / Wikipedia: IBM AIX)
$ avocado run generic/nstress.py:NStress.test
JOB ID : 37fe342a01080cc3f9b14ea5c5f72850deb488b9
JOB LOG : /home/rc/avocado/job-results/job-2017-12-01T11.21-37fe342/job.log
(1/1) generic/nstress.py:NStress.test: FAIL (1.17 s)
RESULTS : PASS 0 | ERROR 0 | FAIL 1 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 0
JOB TIME : 4.55 s
rc@rc-visard-02911982:~/avocado-misc-tests$ cat /home/rc/avocado/job-results/job-2017-12-01T11.21-37fe342/job.log
2017-12-01 11:21:27,038 extension L0189 DEBUG| found extension EntryPoint.parse('journal = avocado.plugins.journal:JournalResult')
2017-12-01 11:21:27,041 extension L0189 DEBUG| found extension EntryPoint.parse('tap = avocado.plugins.tap:TAPResult')
2017-12-01 11:21:27,044 extension L0189 DEBUG| found extension EntryPoint.parse('human = avocado.plugins.human:Human')
2017-12-01 11:21:27,061 extension L0189 DEBUG| found extension EntryPoint.parse('teststmpdir = avocado.plugins.teststmpdir:TestsTmpDir')
2017-12-01 11:21:27,069 extension L0189 DEBUG| found extension EntryPoint.parse('jobscripts = avocado.plugins.jobscripts:JobScripts')
2017-12-01 11:21:27,076 extension L0189 DEBUG| found extension EntryPoint.parse('human = avocado.plugins.human:HumanJob')
2017-12-01 11:21:27,127 sysinfo L0415 INFO | Commands configured by file: /etc/avocado/sysinfo/commands
2017-12-01 11:21:27,130 sysinfo L0426 INFO | Files configured by file: /etc/avocado/sysinfo/files
2017-12-01 11:21:27,133 sysinfo L0446 INFO | Profilers configured by file: /etc/avocado/sysinfo/profilers
2017-12-01 11:21:27,135 sysinfo L0454 INFO | Profiler disabled
2017-12-01 11:21:27,216 sysinfo L0266 DEBUG| Journalctl collection failed: [Errno 2] No such file or directory (journalctl --quiet --lines 1 --output json)
2017-12-01 11:21:27,219 job L0327 INFO | Command line: /usr/local/bin/avocado run generic/nstress.py:NStress.test
2017-12-01 11:21:27,221 job L0328 INFO |
2017-12-01 11:21:27,282 job L0357 INFO | Avocado version: 55.0
2017-12-01 11:21:27,285 job L0358 INFO |
2017-12-01 11:21:27,286 job L0362 INFO | Config files read (in order):
2017-12-01 11:21:27,288 job L0364 INFO | /etc/avocado/avocado.conf
2017-12-01 11:21:27,289 job L0364 INFO | /etc/avocado/conf.d/gdb.conf
2017-12-01 11:21:27,292 job L0364 INFO | /home/rc/.config/avocado/avocado.conf
2017-12-01 11:21:27,293 job L0369 INFO |
2017-12-01 11:21:27,294 job L0371 INFO | Avocado config:
2017-12-01 11:21:27,312 job L0380 INFO | Section.Key Value
2017-12-01 11:21:27,313 job L0380 INFO | datadir.paths.base_dir /var/lib/avocado
2017-12-01 11:21:27,315 job L0380 INFO | datadir.paths.test_dir /usr/share/avocado/tests
2017-12-01 11:21:27,317 job L0380 INFO | datadir.paths.data_dir /var/lib/avocado/data
2017-12-01 11:21:27,318 job L0380 INFO | datadir.paths.logs_dir ~/avocado/job-results
2017-12-01 11:21:27,319 job L0380 INFO | sysinfo.collect.enabled True
2017-12-01 11:21:27,321 job L0380 INFO | sysinfo.collect.commands_timeout -1
2017-12-01 11:21:27,323 job L0380 INFO | sysinfo.collect.installed_packages False
2017-12-01 11:21:27,324 job L0380 INFO | sysinfo.collect.profiler False
2017-12-01 11:21:27,325 job L0380 INFO | sysinfo.collect.locale C
2017-12-01 11:21:27,327 job L0380 INFO | sysinfo.collect.per_test False
2017-12-01 11:21:27,328 job L0380 INFO | sysinfo.collectibles.commands /etc/avocado/sysinfo/commands
2017-12-01 11:21:27,329 job L0380 INFO | sysinfo.collectibles.files /etc/avocado/sysinfo/files
2017-12-01 11:21:27,331 job L0380 INFO | sysinfo.collectibles.profilers /etc/avocado/sysinfo/profilers
2017-12-01 11:21:27,333 job L0380 INFO | runner.output.colored True
2017-12-01 11:21:27,335 job L0380 INFO | runner.output.utf8
2017-12-01 11:21:27,336 job L0380 INFO | remoter.behavior.reject_unknown_hosts False
2017-12-01 11:21:27,338 job L0380 INFO | remoter.behavior.disable_known_hosts False
2017-12-01 11:21:27,340 job L0380 INFO | job.output.loglevel debug
2017-12-01 11:21:27,342 job L0380 INFO | restclient.connection.hostname localhost
2017-12-01 11:21:27,343 job L0380 INFO | restclient.connection.port 9405
2017-12-01 11:21:27,345 job L0380 INFO | restclient.connection.username
2017-12-01 11:21:27,347 job L0380 INFO | restclient.connection.password
2017-12-01 11:21:27,349 job L0380 INFO | plugins.disable []
2017-12-01 11:21:27,350 job L0380 INFO | plugins.skip_broken_plugin_notification []
2017-12-01 11:21:27,352 job L0380 INFO | plugins.loaders ['file', '@DEFAULT']
2017-12-01 11:21:27,353 job L0380 INFO | gdb.paths.gdb /usr/bin/gdb
2017-12-01 11:21:27,355 job L0380 INFO | gdb.paths.gdbserver /usr/bin/gdbserver
2017-12-01 11:21:27,356 job L0381 INFO |
2017-12-01 11:21:27,358 job L0384 INFO | Avocado Data Directories:
2017-12-01 11:21:27,359 job L0385 INFO |
2017-12-01 11:21:27,366 job L0386 INFO | base /home/rc/avocado
2017-12-01 11:21:27,368 job L0387 INFO | tests /usr/share/avocado/tests
2017-12-01 11:21:27,374 job L0388 INFO | data /home/rc/avocado/data
2017-12-01 11:21:27,376 job L0389 INFO | logs /home/rc/avocado/job-results/job-2017-12-01T11.21-37fe342
2017-12-01 11:21:27,378 job L0390 INFO |
2017-12-01 11:21:27,381 job L0396 INFO | No variants available, using defaults only
2017-12-01 11:21:27,382 job L0396 INFO |
2017-12-01 11:21:27,384 job L0396 INFO | Variant : /
2017-12-01 11:21:27,386 job L0400 INFO | Temporary dir: /var/tmp/avocado_ysJFfS
2017-12-01 11:21:27,387 job L0401 INFO |
2017-12-01 11:21:27,388 job L0319 INFO | Job ID: 37fe342a01080cc3f9b14ea5c5f72850deb488b9
2017-12-01 11:21:27,390 job L0322 INFO |
2017-12-01 11:21:28,038 sysinfo L0111 DEBUG| Not logging /proc/pci (file does not exist)
2017-12-01 11:21:28,110 sysinfo L0109 DEBUG| Not logging /proc/slabinfo (lack of permissions)
2017-12-01 11:21:28,264 sysinfo L0111 DEBUG| Not logging /sys/kernel/debug/sched_features (file does not exist)
2017-12-01 11:21:28,778 sysinfo L0415 INFO | Commands configured by file: /etc/avocado/sysinfo/commands
2017-12-01 11:21:28,787 sysinfo L0426 INFO | Files configured by file: /etc/avocado/sysinfo/files
2017-12-01 11:21:28,790 sysinfo L0446 INFO | Profilers configured by file: /etc/avocado/sysinfo/profilers
2017-12-01 11:21:28,792 sysinfo L0454 INFO | Profiler disabled
2017-12-01 11:21:28,891 sysinfo L0266 DEBUG| Journalctl collection failed: [Errno 2] No such file or directory (journalctl --quiet --lines 1 --output json)
2017-12-01 11:21:28,903 varianter L0116 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
2017-12-01 11:21:28,904 test L0381 INFO | START 1-generic/nstress.py:NStress.test
2017-12-01 11:21:28,911 test L0402 DEBUG| Test metadata:
2017-12-01 11:21:28,914 test L0403 DEBUG| filename: /home/rc/avocado-misc-tests/generic/nstress.py
2017-12-01 11:21:29,075 varianter L0116 DEBUG| PARAMS (key=tar_ball_ubuntu, path=*, default=nstress_Ubuntu1410_ppc64_Nov_2015.tar) => 'nstress_Ubuntu1410_ppc64_Nov_2015.tar'
2017-12-01 11:21:29,191 varianter L0116 DEBUG| PARAMS (key=duration, path=*, default=300) => 300
2017-12-01 11:21:29,198 process L0389 INFO | Running '/usr/bin/sudo -n -s ./nmem -m 250 -s 300'
2017-12-01 11:21:29,453 process L0479 DEBUG| [stderr] /bin/bash: ./nmem: cannot execute binary file: Exec format error
2017-12-01 11:21:29,511 process L0499 INFO | Command '/usr/bin/sudo -n -s ./nmem -m 250 -s 300' finished with 126 after 0.274646043777s
2017-12-01 11:21:29,515 nstress L0031 INFO | ./nmem -m 250 -s 300 test failed
2017-12-01 11:21:29,522 process L0389 INFO | Running '/usr/bin/sudo -n -s ./nmem64 -m 2047 -s 300'
2017-12-01 11:21:29,768 process L0479 DEBUG| [stderr] /bin/bash: ./nmem64: cannot execute binary file: Exec format error
2017-12-01 11:21:29,826 process L0499 INFO | Command '/usr/bin/sudo -n -s ./nmem64 -m 2047 -s 300' finished with 126 after 0.267843961716s
2017-12-01 11:21:29,829 nstress L0031 INFO | ./nmem64 -m 2047 -s 300 test failed
2017-12-01 11:21:29,832 stacktrace L0041 ERROR|
2017-12-01 11:21:29,834 stacktrace L0044 ERROR| Reproduced traceback from: /usr/local/lib/python2.7/dist-packages/avocado/core/test.py:817
2017-12-01 11:21:29,844 stacktrace L0047 ERROR| Traceback (most recent call last):
2017-12-01 11:21:29,846 stacktrace L0047 ERROR| File "/home/rc/avocado-misc-tests/generic/nstress.py", line 54, in test
2017-12-01 11:21:29,848 stacktrace L0047 ERROR| self.fail("nstress test failed")
2017-12-01 11:21:29,851 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 983, in fail
2017-12-01 11:21:29,853 stacktrace L0047 ERROR| raise exceptions.TestFail(message)
2017-12-01 11:21:29,856 stacktrace L0047 ERROR| TestFail: nstress test failed
2017-12-01 11:21:29,858 stacktrace L0048 ERROR|
2017-12-01 11:21:29,861 test L0822 DEBUG| Local variables:
2017-12-01 11:21:30,213 test L0825 DEBUG| -> self <class 'nstress.NStress'>: 1-generic/nstress.py:NStress.test
2017-12-01 11:21:30,222 test L0278 DEBUG| DATA (filename=stdout.expected) => NOT FOUND (data sources: variant, test, file)
2017-12-01 11:21:30,227 test L0278 DEBUG| DATA (filename=stderr.expected) => NOT FOUND (data sources: variant, test, file)
2017-12-01 11:21:30,230 test L0965 ERROR| FAIL 1-generic/nstress.py:NStress.test -> TestFail: nstress test failed
2017-12-01 11:21:30,232 test L0954 INFO |
2017-12-01 11:21:30,953 sysinfo L0111 DEBUG| Not logging /proc/pci (file does not exist)
2017-12-01 11:21:31,026 sysinfo L0109 DEBUG| Not logging /proc/slabinfo (lack of permissions)
2017-12-01 11:21:31,183 sysinfo L0111 DEBUG| Not logging /sys/kernel/debug/sched_features (file does not exist)
2017-12-01 11:21:31,641 job L0478 INFO | Test results available in /home/rc/avocado/job-results/job-2017-12-01T11.21-37fe342
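The Exec format error in the log above indicates the prebuilt nmem/nmem64 binaries target a different architecture (the tarball name suggests ppc64) than the ARMv7 host. A minimal sketch, assuming one wanted a host-architecture guard in the test's setUp (the function name and prefix below are hypothetical, not part of the test):

```python
import platform


def arch_matches(expected_prefix, machine=None):
    """Return True when the host machine string starts with the
    architecture prefix the prebuilt binaries were built for."""
    machine = machine or platform.machine()
    return machine.startswith(expected_prefix)


# In an avocado test's setUp() one could then cancel early, e.g.:
#     if not arch_matches("ppc64"):
#         self.cancel("nstress binaries are ppc64-only")
```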
The test lshw.py:test_lshw_options
is specific to scsi.
For systems without scsi it would be helpful to be able to disable this test, to prevent a false positive overall test status of FAIL.
Test generic/lshw.py:Lshwrun.test_lshw_verification
fails in this check because the ifconfig output format is not considered yet.
Output of ifconfig | head -1 | cut -d':' -f1:
eth0 Link encap
Output of lshw -class network:
WARNING: you should run this program as super-user.
*-network
...
logical name: eth0
...
...
WARNING: output may be incomplete or inaccurate, you should run this program as super-user.
On Ubuntu 14.04 the test status is CANCEL
due to TestCancel: Fail to install libtool-bin required for this test.
The reason is that the separate libtool-bin package does not exist on 14.04; there the libtool binary is provided by the libtool package instead.
Any test which uses the patch command should first make sure the patch package is installed via sm.install(); an example is ebizzy.py.
I will send a patch for this.
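As a sketch of that pattern using only the standard library (in the actual tests this would go through avocado's SoftwareManager, i.e. sm.check_installed()/sm.install(); the helper name below is hypothetical), setUp could first determine which required commands are missing and install or cancel accordingly:

```python
import shutil


def missing_binaries(binaries):
    """Return the subset of `binaries` not found on PATH.

    A test's setUp() would pass each missing entry to
    SoftwareManager().install() and self.cancel() if installation
    fails, mirroring what ebizzy.py should do for `patch`.
    """
    return [b for b in binaries if shutil.which(b) is None]
```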
The test Bonnie.test
fails when executed on Ubuntu, during the setup phase in which bonnie++ is built.
Is a package installation, as on Ubuntu via sudo apt-get install bonnie++,
possible on the other currently supported platforms as well?
The README
states that a lot of tests are ported from autotest
. Some tests like /cpu/pmqa.py:Pmqa.test
seem to be ports from the Linaro Power Management test suite (Linaro test definitions). Does anyone know about other relations like these? It would be helpful to have some kind of "mapping" to determine coverage and avoid duplication (in case several frameworks are in use).
Hi,
The Multipath.py script is failing; the job.log file is attached. I have tried to run the script with the dependency code which is still not merged into utils, still with no luck. I have discussed this with @Naresh-ibm and showed him the failure; he will be looking into this.
If I run the IO GenWqe test suite with avocado run ~/avocado-misc-tests-master/io/genwqe
on a device which does not have a GenWQE accelerator, the test status is CANCEL.
RESULTS : PASS 0 | ERROR 0 | FAIL 0 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 9
JOB TIME : 13.16 s
In other cases the test status will be SKIP if setup()
fails because, as in this case, the configuration of the system under test is unsuitable for the test cases.
How can I identify tests which were SKIPped or CANCELed due to an unsuitable configuration of the system under test? Is there a way to "blacklist" these tests prior to test execution?
self.skip()
will be deprecated by the end of this year. Let's adjust the tests in this repo so we don't face any issues in the future.
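The migration is from self.skip() to self.cancel(), which raises TestCancel. A self-contained mock of the pattern (not avocado itself, just an illustration of what adjusted tests would do in setUp()):

```python
class TestCancel(Exception):
    """Stand-in for avocado.core.exceptions.TestCancel."""


class FakeTest:
    """Minimal stand-in for avocado.Test, just enough for the pattern."""

    def cancel(self, message):
        # avocado's Test.cancel() similarly raises TestCancel, which the
        # runner reports as CANCEL instead of the deprecated skip path.
        raise TestCancel(message)

    def setUp(self):
        device_present = False  # hypothetical probe for required hardware
        if not device_present:
            self.cancel("required device not present")
```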
On my system I don't want to allow some governors. If the corresponding tests run, they are "skipped" by pm-qa,
which results in an overall test status of FAIL for pmqa.py.
Is it possible to configure the test cpufreq_05
so that it only executes the tests for specific governors?
When executing the test Aiostress.test
with sudo avocado run perf/aiostress.py:Aiostress.test
on Ubuntu 16.04.3 on an embedded linux system, it ends with status ERROR
due to CmdError: Command './aio-stress foo' failed (rc=-11).
2017-12-05 08:07:10,766 test L0381 INFO | START 1-perf/aiostress.py:Aiostress.test
2017-12-05 08:07:10,771 test L0402 DEBUG| Test metadata:
2017-12-05 08:07:10,773 test L0403 DEBUG| filename: /home/rc/avocado-misc-tests/perf/aiostress.py
2017-12-05 08:07:11,066 software_manager L0692 DEBUG| apt-get version: 1.0.1ubuntu2
2017-12-05 08:07:11,073 process L0389 INFO | Running '/usr/bin/dpkg -s gdebi-core'
2017-12-05 08:07:11,797 process L0479 DEBUG| [stdout] Package: gdebi-core
2017-12-05 08:07:11,803 process L0499 INFO | Command '/usr/bin/dpkg -s gdebi-core' finished with 0 after 0.692743062973s
2017-12-05 08:07:11,805 process L0479 DEBUG| [stdout] Status: install ok installed
2017-12-05 08:07:11,811 process L0479 DEBUG| [stdout] Priority: optional
2017-12-05 08:07:11,815 process L0479 DEBUG| [stdout] Section: admin
2017-12-05 08:07:11,818 process L0479 DEBUG| [stdout] Installed-Size: 132
2017-12-05 08:07:11,824 process L0479 DEBUG| [stdout] Maintainer: Ubuntu Developers <[email protected]>
2017-12-05 08:07:11,827 process L0479 DEBUG| [stdout] Architecture: all
2017-12-05 08:07:11,831 process L0479 DEBUG| [stdout] Source: gdebi
2017-12-05 08:07:11,835 process L0479 DEBUG| [stdout] Version: 0.9.5.3ubuntu3
2017-12-05 08:07:11,838 process L0479 DEBUG| [stdout] Depends: python3:any (>= 3.3.2-2~), python3-apt, python3-debian, file
2017-12-05 08:07:11,842 process L0479 DEBUG| [stdout] Suggests: xz-utils | xz-lzma
2017-12-05 08:07:11,846 process L0479 DEBUG| [stdout] Description: simple tool to install deb files
2017-12-05 08:07:11,849 process L0479 DEBUG| [stdout] gdebi lets you install local deb packages resolving and installing
2017-12-05 08:07:11,853 process L0479 DEBUG| [stdout] its dependencies. apt does the same, but only for remote (http, ftp)
2017-12-05 08:07:11,857 process L0479 DEBUG| [stdout] located packages.
2017-12-05 08:07:11,861 process L0479 DEBUG| [stdout] .
2017-12-05 08:07:11,864 process L0479 DEBUG| [stdout] This package contains the libraries and command-line utility.
2017-12-05 08:07:11,868 process L0479 DEBUG| [stdout] Original-Maintainer: Ubuntu Developers <[email protected]>
2017-12-05 08:07:11,876 process L0389 INFO | Running '/usr/bin/dpkg -s libaio1'
2017-12-05 08:07:12,562 process L0479 DEBUG| [stdout] Package: libaio1
2017-12-05 08:07:12,569 process L0499 INFO | Command '/usr/bin/dpkg -s libaio1' finished with 0 after 0.657629013062s
2017-12-05 08:07:12,573 process L0479 DEBUG| [stdout] Status: install ok installed
2017-12-05 08:07:12,576 process L0479 DEBUG| [stdout] Priority: optional
2017-12-05 08:07:12,580 process L0479 DEBUG| [stdout] Section: libs
2017-12-05 08:07:12,584 process L0479 DEBUG| [stdout] Installed-Size: 52
2017-12-05 08:07:12,588 process L0479 DEBUG| [stdout] Maintainer: Ubuntu Developers <[email protected]>
2017-12-05 08:07:12,594 process L0479 DEBUG| [stdout] Architecture: armhf
2017-12-05 08:07:12,598 process L0479 DEBUG| [stdout] Multi-Arch: same
2017-12-05 08:07:12,602 process L0479 DEBUG| [stdout] Source: libaio
2017-12-05 08:07:12,605 process L0479 DEBUG| [stdout] Version: 0.3.109-4
2017-12-05 08:07:12,608 process L0479 DEBUG| [stdout] Pre-Depends: multiarch-support
2017-12-05 08:07:12,612 process L0479 DEBUG| [stdout] Description: Linux kernel AIO access library - shared library
2017-12-05 08:07:12,617 process L0479 DEBUG| [stdout] This library enables userspace to use Linux kernel asynchronous I/O
2017-12-05 08:07:12,622 process L0479 DEBUG| [stdout] system calls, important for the performance of databases and other
2017-12-05 08:07:12,625 process L0479 DEBUG| [stdout] advanced applications.
2017-12-05 08:07:12,628 process L0479 DEBUG| [stdout] Original-Maintainer: Guillem Jover <[email protected]>
2017-12-05 08:07:12,632 process L0479 DEBUG| [stdout] Homepage: http://www.kernel.org/pub/linux/libs/aio/
2017-12-05 08:07:12,642 process L0389 INFO | Running '/usr/bin/dpkg -s libaio-dev'
2017-12-05 08:07:13,360 process L0479 DEBUG| [stdout] Package: libaio-dev
2017-12-05 08:07:13,365 process L0479 DEBUG| [stdout] Status: install ok installed
2017-12-05 08:07:13,369 process L0479 DEBUG| [stdout] Priority: optional
2017-12-05 08:07:13,371 process L0499 INFO | Command '/usr/bin/dpkg -s libaio-dev' finished with 0 after 0.694006919861s
2017-12-05 08:07:13,377 process L0479 DEBUG| [stdout] Section: libdevel
2017-12-05 08:07:13,381 process L0479 DEBUG| [stdout] Installed-Size: 82
2017-12-05 08:07:13,384 process L0479 DEBUG| [stdout] Maintainer: Ubuntu Developers <[email protected]>
2017-12-05 08:07:13,389 process L0479 DEBUG| [stdout] Architecture: armhf
2017-12-05 08:07:13,392 process L0479 DEBUG| [stdout] Source: libaio
2017-12-05 08:07:13,396 process L0479 DEBUG| [stdout] Version: 0.3.109-4
2017-12-05 08:07:13,400 process L0479 DEBUG| [stdout] Depends: libaio1 (= 0.3.109-4)
2017-12-05 08:07:13,403 process L0479 DEBUG| [stdout] Description: Linux kernel AIO access library - development files
2017-12-05 08:07:13,407 process L0479 DEBUG| [stdout] This library enables userspace to use Linux kernel asynchronous I/O
2017-12-05 08:07:13,410 process L0479 DEBUG| [stdout] system calls, important for the performance of databases and other
2017-12-05 08:07:13,414 process L0479 DEBUG| [stdout] advanced applications.
2017-12-05 08:07:13,417 process L0479 DEBUG| [stdout] Original-Maintainer: Guillem Jover <[email protected]>
2017-12-05 08:07:13,421 process L0479 DEBUG| [stdout] Homepage: http://www.kernel.org/pub/linux/libs/aio/
2017-12-05 08:07:13,436 download L0066 INFO | Fetching https://oss.oracle.com/~mason/aio-stress/aio-stress.c -> /var/lib/avocado/data/cache/aio-stress.c.C0tx_A
2017-12-05 08:07:14,714 process L0389 INFO | Running 'gcc -Wall -o aio-stress /var/lib/avocado/data/cache/aio-stress.c -laio -lpthread'
2017-12-05 08:07:15,577 process L0479 DEBUG| [stderr] /var/lib/avocado/data/cache/aio-stress.c: In function ‘worker’:
2017-12-05 08:07:15,583 process L0479 DEBUG| [stderr] /var/lib/avocado/data/cache/aio-stress.c:1175:3: warning: format ‘%ld’ expects argument of type ‘long int’, but argument 3 has type ‘int’ [-Wformat=]
2017-12-05 08:07:15,587 process L0479 DEBUG| [stderr] t->stage_mb_trans, seconds);
2017-12-05 08:07:15,590 process L0479 DEBUG| [stderr] ^
2017-12-05 08:07:15,609 process L0479 DEBUG| [stderr] /var/lib/avocado/data/cache/aio-stress.c: In function ‘main’:
2017-12-05 08:07:15,614 process L0479 DEBUG| [stderr] /var/lib/avocado/data/cache/aio-stress.c:1455:10: warning: format ‘%lu’ expects argument of type ‘long unsigned int’, but argument 3 has type ‘off_t’ [-Wformat=]
2017-12-05 08:07:15,619 process L0479 DEBUG| [stderr] file_size, num_contexts);
2017-12-05 08:07:15,623 process L0479 DEBUG| [stderr] ^
2017-12-05 08:07:15,627 process L0479 DEBUG| [stderr] /var/lib/avocado/data/cache/aio-stress.c:1460:6: warning: format ‘%lu’ expects argument of type ‘long unsigned int’, but argument 3 has type ‘off_t’ [-Wformat=]
2017-12-05 08:07:15,630 process L0479 DEBUG| [stderr] file_size / (1024 * 1024), rec_len / 1024, depth, io_iter);
2017-12-05 08:07:15,634 process L0479 DEBUG| [stderr] ^
2017-12-05 08:07:15,639 process L0479 DEBUG| [stderr] /var/lib/avocado/data/cache/aio-stress.c:1465:6: warning: format ‘%lu’ expects argument of type ‘long unsigned int’, but argument 6 has type ‘off_t’ [-Wformat=]
2017-12-05 08:07:15,643 process L0479 DEBUG| [stderr] context_offset / (1024 * 1024), verify ? "on" : "off");
2017-12-05 08:07:15,647 process L0479 DEBUG| [stderr] ^
2017-12-05 08:07:18,582 process L0499 INFO | Command 'gcc -Wall -o aio-stress /var/lib/avocado/data/cache/aio-stress.c -laio -lpthread' finished with 0 after 3.82759809494s
2017-12-05 08:07:18,588 process L0389 INFO | Running './aio-stress foo'
2017-12-05 08:07:18,632 process L0479 DEBUG| [stderr] file size 1024MB, record size 0KB, depth 64, ios per iteration 64
2017-12-05 08:07:18,640 process L0479 DEBUG| [stderr] max io_submit 8, buffer alignment set to 4KB
2017-12-05 08:07:18,643 process L0499 INFO | Command './aio-stress foo' finished with -11 after 0.0169310569763s
2017-12-05 08:07:18,654 stacktrace L0041 ERROR|
2017-12-05 08:07:18,656 stacktrace L0044 ERROR| Reproduced traceback from: /usr/local/lib/python2.7/dist-packages/avocado/core/test.py:817
2017-12-05 08:07:18,666 stacktrace L0047 ERROR| Traceback (most recent call last):
2017-12-05 08:07:18,668 stacktrace L0047 ERROR| File "/home/rc/avocado-misc-tests/perf/aiostress.py", line 74, in test
2017-12-05 08:07:18,671 stacktrace L0047 ERROR| process.run(cmd)
2017-12-05 08:07:18,675 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 1117, in run
2017-12-05 08:07:18,676 stacktrace L0047 ERROR| raise CmdError(cmd, sp.result)
2017-12-05 08:07:18,678 stacktrace L0047 ERROR| CmdError: Command './aio-stress foo' failed (rc=-11)
2017-12-05 08:07:18,680 stacktrace L0048 ERROR|
2017-12-05 08:07:18,682 test L0822 DEBUG| Local variables:
2017-12-05 08:07:19,027 test L0825 DEBUG| -> self <class 'aiostress.Aiostress'>: 1-perf/aiostress.py:Aiostress.test
2017-12-05 08:07:19,029 test L0825 DEBUG| -> cmd <type 'str'>: ./aio-stress foo
2017-12-05 08:07:19,035 test L0278 DEBUG| DATA (filename=stdout.expected) => NOT FOUND (data sources: variant, test, file)
2017-12-05 08:07:19,040 test L0278 DEBUG| DATA (filename=stderr.expected) => NOT FOUND (data sources: variant, test, file)
2017-12-05 08:07:19,044 test L0950 ERROR| Traceback (most recent call last):
2017-12-05 08:07:19,046 test L0950 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 888, in _run_avocado
raise test_exception
2017-12-05 08:07:19,048 test L0950 ERROR| CmdError: Command './aio-stress foo' failed (rc=-11)
2017-12-05 08:07:19,050 test L0965 ERROR| ERROR 1-perf/aiostress.py:Aiostress.test -> CmdError: Command './aio-stress foo' failed (rc=-11)
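In avocado's process utilities a negative return code means the process was terminated by that signal, so rc=-11 is SIGSEGV: aio-stress itself crashed rather than reporting a test failure. A small sketch of that mapping:

```python
import signal


def describe_rc(rc):
    """Map a subprocess/avocado-style return code to a description:
    negative values encode the number of the terminating signal."""
    if rc < 0:
        return "killed by %s" % signal.Signals(-rc).name
    return "exited with %d" % rc
```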
If the test Compilebench.test
(located in ~/avocado-misc-tests-master/perf/compilebench.py
) is executed when running the performance tests with avocado run ~/avocado-misc-tests-master/perf
(I use avocado v55.0 from PyPI), it throws a RuntimeError. The following tests in the test suite are executed:
$ avocado run ~/avocado-misc-tests-master/perf
JOB ID : b03ec98d21a4f007645642e2abc04275c0627e0d
JOB LOG : /home/rc/avocado/job-results/job-2017-11-08T15.38-b03ec98/job.log
...
(03/16) /home/rc/avocado-misc-tests-master/perf/compilebench.py:Compilebench.test: -
Avocado crashed: RuntimeError: maximum recursion depth exceeded in cmp
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/avocado/core/job.py", line 474, in run_tests
execution_order)
File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 616, in run_suite
deadline):
File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 396, in run_test
proc.start()
File "/usr/lib/python2.7/multiprocessing/process.py", line 130, in start
self._popen = Popen(self)
File "/usr/lib/python2.7/multiprocessing/forking.py", line 126, in __init__
code = process_obj._bootstrap()
File "/usr/lib/python2.7/multiprocessing/process.py", line 274, in _bootstrap
sys.stderr.write('Process %s:\n' % self.name)
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 640, in write
self._log_line("%s\n" % data_lines[0])
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 651, in _log_line
logger.log(self._level, prefix + line)
File "/usr/lib/python2.7/logging/__init__.py", line 1216, in log
self._log(level, msg, args, **kwargs)
File "/usr/lib/python2.7/logging/__init__.py", line 1271, in _log
self.handle(record)
File "/usr/lib/python2.7/logging/__init__.py", line 1281, in handle
self.callHandlers(record)
File "/usr/lib/python2.7/logging/__init__.py", line 1321, in callHandlers
hdlr.handle(record)
File "/usr/lib/python2.7/logging/__init__.py", line 749, in handle
self.emit(record)
File "/usr/lib/python2.7/logging/__init__.py", line 942, in emit
StreamHandler.emit(self, record)
File "/usr/lib/python2.7/logging/__init__.py", line 879, in emit
self.handleError(record)
File "/usr/lib/python2.7/logging/__init__.py", line 802, in handleError
None, sys.stderr)
File "/usr/lib/python2.7/traceback.py", line 124, in print_exception
_print(file, 'Traceback (most recent call last):')
File "/usr/lib/python2.7/traceback.py", line 13, in _print
file.write(str+terminator)
... (traceback between line 474, in run_tests and line 13, in _print repeated many times)
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 640, in write
self._log_line("%s\n" % data_lines[0])
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 651, in _log_line
logger.log(self._level, prefix + line)
File "/usr/lib/python2.7/logging/__init__.py", line 1216, in log
self._log(level, msg, args, **kwargs)
File "/usr/lib/python2.7/logging/__init__.py", line 1270, in _log
record = self.makeRecord(self.name, level, fn, lno, msg, args, exc_info, func, extra)
File "/usr/lib/python2.7/logging/__init__.py", line 1244, in makeRecord
rv = LogRecord(name, level, fn, lno, msg, args, exc_info, func)
File "/usr/lib/python2.7/logging/__init__.py", line 271, in __init__
self.module = os.path.splitext(self.filename)[0]
File "/usr/lib/python2.7/posixpath.py", line 105, in splitext
return genericpath._splitext(p, sep, altsep, extsep)
File "/usr/lib/python2.7/genericpath.py", line 101, in _splitext
if p[filenameIndex] != extsep:
RuntimeError: maximum recursion depth exceeded in cmp
Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
Error running method "render" of plugin "xunit": [Errno 28] No space left on device
ERROR (770.18 s)
(04/16) /home/rc/avocado-misc-tests-master/perf/hackbench.py:Hackbench.test:
generic/error_cleanup.py leads to a "hard coded" test status of ERROR,
which means it is unsuitable to be executed in test suites. Is it a test for avocado's
teardown functionality? Would it be suitable to move the test, or to inform users that it should not be used in test suites?
Older hardware does not support options above SMT=4.
Power6/Power7
[root@ltcp6 ~]# ppc64_cpu --smt=5
SMT=5 is not valid
[root@ltcp6 ~]# echo $?
255
Power8
root@tuleta4u-lp2:# ppc64_cpu --smt=5
root@tuleta4u-lp2:# echo $?
0
A check for the hardware's supported SMT values could be added before running the command.
Thanks
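A hedged sketch of such a guard, based on the behaviour shown above (exit status 255 on POWER6/POWER7 for unsupported values, 0 on POWER8); the runner parameter is only there so the decision logic can be exercised without the real ppc64_cpu binary:

```python
import subprocess


def smt_value_supported(value, runner=None):
    """Return True if `ppc64_cpu --smt=<value>` exits 0 on this host.

    `runner` is a hook taking the SMT value and returning an exit
    status; by default it invokes the real tool.
    """
    if runner is None:
        def runner(v):
            return subprocess.call(["ppc64_cpu", "--smt=%d" % v])
    return runner(value) == 0
```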
There is a problem with the check of self.name against the test case name, so the script is not skipping the test cases/iterations which are not applicable. I have fixed that and will submit a patch.
If the IO/disk tests are run on an embedded linux device with avocado run ~/avocado-misc-tests-master/io/disk,
the test run aborts during execution of FioTest.test, which leads to a crash of Avocado:
(06/75) /home/rc/avocado-misc-tests-master/io/disk/fiotest.py:FioTest.test: -Unhandled exception in thread started by
\Error in sys.excepthook:
Original exception was:
-Unhandled exception in thread started by-Avocado crashed unexpectedly: maximum recursion depth exceeded
You can find details in /var/tmp/avocado-traceback-2017-11-09_09:13:55-uPvpy8.log
ERROR (1015.91 s)
Avocado crashed: TestError: Process died before it pushed early test_status.
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/avocado/core/job.py", line 474, in run_tests
execution_order)
File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 616, in run_suite
deadline):
File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 398, in run_test
test_status.wait_for_early_status(proc, 60)
File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 157, in wait_for_early_status
raise exceptions.TestError("Process died before it pushed "
TestError: Process died before it pushed early test_status.
Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
The relevant log file content:
rc@rc-visard-02911982:~$ cat /var/tmp/avocado-traceback-2017-11-09_09:13:55-uPvpy8.log
Avocado crashed:
Traceback (most recent call last):
File "/usr/lib/python2.7/threading.py", line 783, in __bootstrap
self.__bootstrap_inner()
File "/usr/lib/python2.7/threading.py", line 823, in __bootstrap_inner
(self.name, _format_exc()))
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 640, in write
self._log_line("%s\n" % data_lines[0])
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 651, in _log_line
logger.log(self._level, prefix + line)
File "/usr/lib/python2.7/logging/__init__.py", line 1216, in log
self._log(level, msg, args, **kwargs)
File "/usr/lib/python2.7/logging/__init__.py", line 1271, in _log
self.handle(record)
File "/usr/lib/python2.7/logging/__init__.py", line 1281, in handle
self.callHandlers(record)
File "/usr/lib/python2.7/logging/__init__.py", line 1321, in callHandlers
hdlr.handle(record)
File "/usr/lib/python2.7/logging/__init__.py", line 749, in handle
self.emit(record)
File "/usr/lib/python2.7/logging/__init__.py", line 942, in emit
StreamHandler.emit(self, record)
File "/usr/lib/python2.7/logging/__init__.py", line 879, in emit
self.handleError(record)
File "/usr/lib/python2.7/logging/__init__.py", line 802, in handleError
None, sys.stderr)
File "/usr/lib/python2.7/traceback.py", line 124, in print_exception
_print(file, 'Traceback (most recent call last):')
File "/usr/lib/python2.7/traceback.py", line 13, in _print
(traceback repeated between "line 640" and "line 13" statements a lot of times)
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 640, in write
self._log_line("%s\n" % data_lines[0])
File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 651, in _log_line
logger.log(self._level, prefix + line)
File "/usr/lib/python2.7/logging/__init__.py", line 1216, in log
self._log(level, msg, args, **kwargs)
File "/usr/lib/python2.7/logging/__init__.py", line 1262, in _log
fn, lno, func = self.findCaller()
File "/usr/lib/python2.7/logging/__init__.py", line 1223, in findCaller
f = currentframe()
RuntimeError: maximum recursion depth exceeded
Could it be necessary to adjust the "maximum recursion depth" when the test is run on an embedded linux device?
The test script hardcodes 100 GB as the default disk size, so the test works for all disks <= 100 GB:
if gigabytes is None:
    free = 100  # cap it at 100GB by default
    for disk in self.disks:
        free = min(utils_disk.freespace(disk) / 1073741824, free)
    gigabytes = free
The above calculation fails if the disk is of a larger size, like 1 TB, with this error:
"Free disk space is lower than chunk size (10240, 261409)"
Relevant part of a test execution:
2017-05-12 19:43:27,023 process L0389 INFO | Running 'make -C cpufreq run_tests'
2017-05-12 19:43:27,028 process L0479 DEBUG| [stdout] make: Entering directory '/var/tmp/avocado_rjsrAE/1-cpu_pmqa.py_Pmqa.test/src/cpufreq'
2017-05-12 19:43:27,037 process L0479 DEBUG| [stdout] ./cpufreq_sanity.sh
2017-05-12 19:43:27,045 process L0479 DEBUG| [stdout] cpufreq_sanity.0: user is not root... skip
2017-05-12 19:43:27,045 process L0479 DEBUG| [stdout] make: Leaving directory '/var/tmp/avocado_rjsrAE/1-cpu_pmqa.py_Pmqa.test/src/cpufreq'
2017-05-12 19:43:27,045 process L0499 INFO | Command 'make -C cpufreq run_tests' finished with 0 after 0.0192301273346s
2017-05-12 19:43:27,045 process L0389 INFO | Running 'grep -wF 'fail' /home/cleber/avocado/job-results/job-2017-05-12T19.43-1e06a7a/test-results/1-cpu_pmqa.py:Pmqa.test/stdout'
2017-05-12 19:43:27,050 process L0499 INFO | Command 'grep -wF 'fail' /home/cleber/avocado/job-results/job-2017-05-12T19.43-1e06a7a/test-results/1-cpu_pmqa.py:Pmqa.test/stdout' finished with 1 after 0.000494003295898s
2017-05-12 19:43:27,050 sysinfo L0366 DEBUG| Not logging /var/log/messages (lack of permissions)
2017-05-12 19:43:27,058 test L0750 INFO | PASS 1-cpu/pmqa.py:Pmqa.test
The installation of lsscsi in the setup of the arcconf tests fails, so tests like io/disk/arcconf/arcconf_cntl_oper.py are not executed when run on Ubuntu 14.04.5 LTS:
44-./io/disk/arcconf/arcconf_cntl_oper.py:Arcconftest.test -> TestCancel: Unable to install lsscsi
45-./io/disk/arcconf/arcconf_drive_oper.py:Arcconftest.test -> TestCancel: Unable to install lsscsi
46-./io/disk/arcconf/arcconf_migration.py:Arcconftest.test -> TestCancel: Unable to install lsscsi
47-./io/disk/arcconf/arcconf_raid_oper.py:Arcconftest.test -> TestCancel: Unable to install lsscsi
I needed to install it manually with sudo apt-get install lsscsi.
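A stdlib-only sketch of that manual workaround as a setup step (the repository's tests typically use avocado.utils' SoftwareManager for this instead; ensure_command and its signature are my own illustration):

```python
import shutil
import subprocess

def ensure_command(name, apt_package=None):
    # True if `name` is on PATH; optionally try an apt install first.
    if shutil.which(name):
        return True
    if apt_package:
        subprocess.call(["sudo", "apt-get", "install", "-y", apt_package])
    return shutil.which(name) is not None
```

In an Avocado test's setUp() one would cancel instead of proceeding when this returns False.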
If DmaMemtest.test is run with Avocado v55.0 on a memory-constrained embedded Linux device, Avocado crashes due to a RuntimeError. However, Avocado "recovers from the crash" and continues the test suite execution. The test could be skipped for embedded devices as described in this related issue.
$ avocado run ~/avocado-misc-tests-master/memory/
JOB ID : baf4c4fa2a346e2eaae33d5c8e4b25074b0a84ad
JOB LOG : /home/rc/avocado/job-results/job-2017-11-09T17.19-baf4c4f/job.log
(01/11) /home/rc/avocado-misc-tests-master/memory/dma_memtest.py:DmaMemtest.test: \
Avocado crashed: RuntimeError: maximum recursion depth exceeded in cmp
The related source code: memory/dma_memtest.py class DmaMemtest(Test)
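One way to implement the suggested skip is to cancel the test in setUp() based on total memory; Avocado tests can call self.cancel() for this. A minimal sketch, where the parsing helper and the 1 GB threshold are my assumptions, not project policy:

```python
def mem_total_kb(meminfo_text):
    # Parse the MemTotal line (value is in kB) from /proc/meminfo.
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            return int(line.split()[1])
    raise ValueError("MemTotal not found")

# In the test's setUp() one could then cancel on small machines, e.g.:
#
#     with open("/proc/meminfo") as proc_meminfo:
#         if mem_total_kb(proc_meminfo.read()) < 1024 * 1024:
#             self.cancel("not enough memory for DmaMemtest")
```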
Problem statement:
Through the service_check.py test we can only verify services listed in the cfg file (services.cfg). Could this test be extended to check all OS services available on the system?
A possible solution:
Query all running services available in the OS and perform the same operations as in the current flow.
Test:
https://github.com/avocado-framework-tests/avocado-misc-tests/blob/master/generic/service_check.py
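On systemd-based distributions, the running services could be enumerated with `systemctl list-units --type=service --state=running --no-legend` and fed into the existing check loop. A sketch of the parsing step (it assumes systemd; SysV-init systems would need a different query):

```python
def parse_running_services(systemctl_output):
    # The unit name is the first column of the --no-legend output.
    services = []
    for line in systemctl_output.splitlines():
        fields = line.split()
        if fields and fields[0].endswith(".service"):
            services.append(fields[0])
    return services
```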
Softwareraid with three block devices mentioned in the yaml file does pass the RAID 5 scenario with the fail-over test.
=====console log=====
linux-0a5l:/avocado-misc-tests/io/disk # avocado run softwareraid.py -m softwareraid.py.data/softwareraid.yaml
JOB ID : d84adcedbc346c4f86965e06a8159e796c931da1
JOB LOG : /root/avocado/job-results/job-2017-05-18T05.23-d84adce/job.log
(1/6) softwareraid.py:SoftwareRaid.test_run;raidlinear-scenario-4a19: PASS (0.38 s)
(2/6) softwareraid.py:SoftwareRaid.test_run;raid0-scenario-716d: PASS (0.41 s)
(3/6) softwareraid.py:SoftwareRaid.test_run;raid1-scenario-a5ca: PASS (3545.16 s)
(4/6) softwareraid.py:SoftwareRaid.test_run;raid5-scenario-f871: PASS (1896.36 s)
(5/6) softwareraid.py:SoftwareRaid.test_run;raid10-scenario-3425: PASS (1927.12 s)
(6/6) softwareraid.py:SoftwareRaid.test_run;raid6-scenario-110f: FAIL (0.10 s)
RESULTS : PASS 5 | ERROR 0 | FAIL 1 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 0
TESTS TIME : 7369.54 s
You have new mail in /var/mail/root
linux-0a5l:/avocado-misc-tests/io/disk # lsblk
NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINT
sda 8:0 0 35G 0 disk
├─sda1 8:1 0 7M 0 part
├─sda2 8:2 0 2G 0 part
├─sda3 8:3 0 13.8G 0 part
├─sda4 8:4 0 19.2G 0 part
└─SAIX_VDASD_00fa20e000004c00000001545398a86b.13 254:0 0 35G 0 mpath
├─SAIX_VDASD_00fa20e000004c00000001545398a86b.13-part1 254:1 0 7M 0 part
├─SAIX_VDASD_00fa20e000004c00000001545398a86b.13-part2 254:2 0 2G 0 part [SWAP]
├─SAIX_VDASD_00fa20e000004c00000001545398a86b.13-part3 254:3 0 13.8G 0 part /opt
└─SAIX_VDASD_00fa20e000004c00000001545398a86b.13-part4 254:4 0 19.2G 0 part /home
sdb 8:16 0 361.3G 0 disk
├─sdb1 8:17 0 361.3G 0 part
└─1IBM_IPR-0_59D6110000003AD0 254:5 0 361.3G 0 mpath
└─1IBM_IPR-0_59D6110000003AD0-part1 254:7 0 361.3G 0 part
sdc 8:32 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000003AF0 254:6 0 361.3G 0 mpath
sdd 8:48 0 361.3G 0 disk
├─sdd1 8:49 0 361.3G 0 part
└─1IBM_IPR-0_59D6110000003AD0 254:5 0 361.3G 0 mpath
└─1IBM_IPR-0_59D6110000003AD0-part1 254:7 0 361.3G 0 part
sde 8:64 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000003AF0 254:6 0 361.3G 0 mpath
sdf 8:80 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000004150 254:8 0 361.3G 0 mpath
sdg 8:96 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000004150 254:8 0 361.3G 0 mpath
sdh 8:112 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000004170 254:9 0 361.3G 0 mpath
sdi 8:128 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000004170 254:9 0 361.3G 0 mpath
sdj 8:144 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000004190 254:10 0 361.3G 0 mpath
sdk 8:160 0 361.3G 0 disk
└─1IBM_IPR-0_59D6110000004190 254:10 0 361.3G 0 mpath
sr0 11:0 1 1024M 0 rom
sr1 11:1 1 1024M 0 rom
sr2 11:2 1 1024M 0 rom
linux-0a5l:~/avocado-misc-tests/io/disk # cat softwareraid.py.data/softwareraid.yaml
scenario:
    disk: /dev/dm-10 /dev/dm-9 /dev/dm-8
    raidlevel: !mux
        raidlinear:
            raid: linear
        raid0:
            raid: 0
        raid1:
            raid: 1
        raid5:
            raid: 5
        raid10:
            raid: 10
        raid6:
            raid: 6
When the test generic/packages.py:Package_check.test is executed on a system with an ARMv7 CPU it fails. The test seems to be specific to powerpc (job_generic_repo_tests.log is the log file of the whole generic test suite):
$ cat job_generic_repo_tests.log | grep -B 34 "| FAIL 21"
2017-11-09 18:09:10,328 test L0381 INFO | START 21-/home/rc/avocado-misc-tests-master/generic/packages.py:Package_check.test
2017-11-09 18:09:10,329 test L0402 DEBUG| Test metadata:
2017-11-09 18:09:10,329 test L0403 DEBUG| filename: /home/rc/avocado-misc-tests-master/generic/packages.py
2017-11-09 18:09:10,351 varianter L0116 DEBUG| PARAMS (key=packages, path=*, default=['powerpc-utils', 'ppc64-diag', 'lsvpd']) => ['powerpc-utils', 'ppc64-diag', 'lsvpd']
2017-11-09 18:09:10,353 varianter L0116 DEBUG| PARAMS (key=packages_ubuntu, path=*, default=['librtas2']) => ['librtas2']
2017-11-09 18:09:10,353 process L0389 INFO | Running 'apt-mark showauto powerpc-utils'
2017-11-09 18:09:10,853 process L0499 INFO | Command 'apt-mark showauto powerpc-utils' finished with 0 after 0.495363950729s
2017-11-09 18:09:10,854 packages L0059 INFO | powerpc-utils package is not installed by default
2017-11-09 18:09:10,854 process L0389 INFO | Running 'apt-mark showauto ppc64-diag'
2017-11-09 18:09:11,366 process L0499 INFO | Command 'apt-mark showauto ppc64-diag' finished with 0 after 0.507091999054s
2017-11-09 18:09:11,367 packages L0059 INFO | ppc64-diag package is not installed by default
2017-11-09 18:09:11,367 process L0389 INFO | Running 'apt-mark showauto lsvpd'
2017-11-09 18:09:11,867 process L0499 INFO | Command 'apt-mark showauto lsvpd' finished with 0 after 0.495292901993s
2017-11-09 18:09:11,867 packages L0059 INFO | lsvpd package is not installed by default
2017-11-09 18:09:11,868 process L0389 INFO | Running 'apt-mark showauto librtas2'
2017-11-09 18:09:12,369 process L0499 INFO | Command 'apt-mark showauto librtas2' finished with 0 after 0.49556684494s
2017-11-09 18:09:12,370 packages L0059 INFO | librtas2 package is not installed by default
2017-11-09 18:09:12,370 stacktrace L0041 ERROR|
2017-11-09 18:09:12,370 stacktrace L0044 ERROR| Reproduced traceback from: /usr/local/lib/python2.7/dist-packages/avocado/core/test.py:817
2017-11-09 18:09:12,371 stacktrace L0047 ERROR| Traceback (most recent call last):
2017-11-09 18:09:12,372 stacktrace L0047 ERROR| File "/home/rc/avocado-misc-tests-master/generic/packages.py", line 68, in test
2017-11-09 18:09:12,372 stacktrace L0047 ERROR| self.fail("%s package(s) not installed by default" % is_fail)
2017-11-09 18:09:12,372 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 983, in fail
2017-11-09 18:09:12,372 stacktrace L0047 ERROR| raise exceptions.TestFail(message)
2017-11-09 18:09:12,372 stacktrace L0047 ERROR| TestFail: 4 package(s) not installed by default
2017-11-09 18:09:12,373 stacktrace L0048 ERROR|
2017-11-09 18:09:12,373 test L0822 DEBUG| Local variables:
2017-11-09 18:09:12,408 test L0825 DEBUG| -> packages_ubuntu <type 'list'>: ['librtas2']
2017-11-09 18:09:12,409 test L0825 DEBUG| -> is_fail <type 'int'>: 4
2017-11-09 18:09:12,409 test L0825 DEBUG| -> self <class 'packages.Package_check'>: 21-/home/rc/avocado-misc-tests-master/generic/packages.py:Package_check.test
2017-11-09 18:09:12,409 test L0825 DEBUG| -> dist <class 'avocado.utils.distro.LinuxDistro'>: <LinuxDistro: name=Ubuntu, version=14, release=04, arch=armv7l>
2017-11-09 18:09:12,409 test L0825 DEBUG| -> package <type 'str'>: librtas2
2017-11-09 18:09:12,410 test L0278 DEBUG| DATA (filename=stdout.expected) => NOT FOUND (data sources: variant, test, file)
2017-11-09 18:09:12,410 test L0278 DEBUG| DATA (filename=stderr.expected) => NOT FOUND (data sources: variant, test, file)
2017-11-09 18:09:12,411 test L0965 ERROR| FAIL 21-/home/rc/avocado-misc-tests-master/generic/packages.py:Package_check.test -> TestFail: 4 package(s) not installed by default
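Since the default package list (powerpc-utils, ppc64-diag, lsvpd) is powerpc-specific, one fix would be to cancel the test on other architectures. A hedged sketch; is_powerpc is my own helper, and the cancel call in the comment follows Avocado's Test.cancel API:

```python
import platform

def is_powerpc(machine=None):
    # All powerpc machine strings (ppc, ppc64, ppc64le) share the
    # "ppc" prefix.  `machine` overrides autodetection, for testing.
    machine = machine or platform.machine()
    return machine.startswith("ppc")

# A test whose default package list is powerpc-specific could then
# cancel itself early in setUp(), e.g.:
#     if not is_powerpc():
#         self.cancel("default package list is powerpc-specific")
```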
I want to define a custom set of tests which shall be run as part of a build job (e.g. a "smoke test"). I would like to select, configure and execute them. What possibilities do I have?
As far as I know, the configuration of tests happens in test-specific .yaml files. It is possible to include several .yaml configurations in a single file, see #729 (comment). But with respect to maintenance I am not sure this is really a good option. (One limitation is that the execution order of the tests may vary.)
The tests may be executed on a directory basis (test suite), e.g. sudo avocado run io/net/, which executes all tests of this "suite". Tests can be executed on a file basis (usually several tests per file). It is possible to execute single test classes specifically, e.g. sudo avocado run generic/lshw.py:Lshwrun, or single tests with sudo avocado run generic/lshw.py:Lshwrun.test_lshw_verification.
Right now the information about which tests shall be executed is stored in the command-line invocation of avocado, e.g. avocado run io/net/ sleeptest.py generic/lshw.py:Lshwrun generic/lshw.py:Lshwrun.test_lshw_verification. Are there alternatives to this approach?
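One pragmatic alternative is to keep the test references in a versioned "suite file" and expand it when invoking avocado. This is a local convention sketch, not an Avocado feature; the one-reference-per-line format is my assumption:

```python
def load_suite(text):
    # One test reference per line; blank lines and '#' comments ignored.
    refs = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            refs.append(line)
    return refs

# The list can then be expanded onto the command line, e.g.:
#     subprocess.call(["avocado", "run"] + load_suite(open("smoke.suite").read()))
```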
If I run the IO driver test suite with avocado run ~/avocado-misc-tests-master/io/driver (I downloaded and unzipped the test repository into ~/avocado-misc-tests-master) the execution of test 4 leads to an error:
2017-11-09 10:46:58,746 test L0965 ERROR| ERROR 4-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml -> OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml)
Seems to be related to #691.
The relevant log file content:
2017-11-09 10:46:57,487 test L0381 INFO | START 4-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml
2017-11-09 10:46:57,494 test L0402 DEBUG| Test metadata:
2017-11-09 10:46:57,510 test L0403 DEBUG| filename: /home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml
2017-11-09 10:46:57,623 process L0389 INFO | Running '/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml'
2017-11-09 10:46:57,766 stacktrace L0041 ERROR|
2017-11-09 10:46:57,770 stacktrace L0044 ERROR| Reproduced traceback from: /usr/local/lib/python2.7/dist-packages/avocado/core/test.py:817
2017-11-09 10:46:57,802 stacktrace L0047 ERROR| Traceback (most recent call last):
2017-11-09 10:46:57,809 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 1113, in test
2017-11-09 10:46:57,815 stacktrace L0047 ERROR| self._execute_cmd()
2017-11-09 10:46:57,821 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 1097, in _execute_cmd
2017-11-09 10:46:57,824 stacktrace L0047 ERROR| env=test_params)
2017-11-09 10:46:57,838 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 1114, in run
2017-11-09 10:46:57,840 stacktrace L0047 ERROR| cmd_result = sp.run(timeout=timeout)
2017-11-09 10:46:57,852 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 638, in run
2017-11-09 10:46:57,854 stacktrace L0047 ERROR| self._init_subprocess()
2017-11-09 10:46:57,856 stacktrace L0047 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/utils/process.py", line 402, in _init_subprocess
2017-11-09 10:46:57,859 stacktrace L0047 ERROR| raise details
2017-11-09 10:46:57,875 stacktrace L0047 ERROR| OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml)
2017-11-09 10:46:57,877 stacktrace L0048 ERROR|
2017-11-09 10:46:57,880 test L0822 DEBUG| Local variables:
2017-11-09 10:46:58,700 test L0825 DEBUG| -> self <class 'avocado.core.test.SimpleTest'>: 4-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml
2017-11-09 10:46:58,707 test L0278 DEBUG| DATA (filename=stdout.expected) => NOT FOUND (data sources: variant, file)
2017-11-09 10:46:58,725 test L0278 DEBUG| DATA (filename=stderr.expected) => NOT FOUND (data sources: variant, file)
2017-11-09 10:46:58,729 test L0950 ERROR| Traceback (most recent call last):
2017-11-09 10:46:58,731 test L0950 ERROR| File "/usr/local/lib/python2.7/dist-packages/avocado/core/test.py", line 888, in _run_avocado
raise test_exception
2017-11-09 10:46:58,744 test L0950 ERROR| OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml)
2017-11-09 10:46:58,746 test L0965 ERROR| ERROR 4-/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml -> OSError: [Errno 8] Exec format error (/home/rc/avocado-misc-tests-master/io/driver/driver_bind_test.py.data/driver_bind_test.yaml)
2017-11-09 10:46:58,748 test L0954 INFO |
2017-11-09 10:47:00,096 sysinfo L0111 DEBUG| Not logging /proc/pci (file does not exist)
2017-11-09 10:47:00,231 sysinfo L0109 DEBUG| Not logging /proc/slabinfo (lack of permissions)
2017-11-09 10:47:00,512 sysinfo L0111 DEBUG| Not logging /sys/kernel/debug/sched_features (file does not exist)
2017-11-09 10:47:01,573 job L0478 INFO | Test results available in /home/rc/avocado/job-results/job-2017-11-09T10.46-adb194e
perf/stress.py
generic/intrebench
These test cases create a huge stress on the machine, causing the test to hang so that it never completes, and the report is not generated. The test cases should handle this themselves by killing the PID of never-ending workloads.
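The suggested fix can be sketched as a watchdog around the stress workload; note that avocado.utils.process.run() also accepts a timeout parameter, which may be the more idiomatic place to enforce this. A minimal stdlib sketch (run_with_timeout is my own helper):

```python
import subprocess

def run_with_timeout(cmd, timeout_s):
    # Run cmd; if it outlives timeout_s seconds, kill it.
    # Returns (returncode, timed_out).  Note: kill() only signals the
    # direct child; workloads that fork would need process-group
    # handling (start_new_session=True plus os.killpg).
    proc = subprocess.Popen(cmd)
    try:
        proc.wait(timeout=timeout_s)
        return proc.returncode, False
    except subprocess.TimeoutExpired:
        proc.kill()
        proc.wait()
        return proc.returncode, True
```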