Test repository test Compilebench.test throws RuntimeError #2286

Closed

fkromer opened this issue Nov 8, 2017 · 9 comments

Comments

fkromer commented Nov 8, 2017

When the performance tests are run with avocado run ~/avocado-misc-tests-master/perf (I use avocado v55.0 from PyPI), the test Compilebench.test (located in ~/avocado-misc-tests-master/perf/compilebench.py) throws a RuntimeError. The test suite executes as follows.

$ avocado run ~/avocado-misc-tests-master/perf
JOB ID     : b03ec98d21a4f007645642e2abc04275c0627e0d
JOB LOG    : /home/rc/avocado/job-results/job-2017-11-08T15.38-b03ec98/job.log
 ...
 (03/16) /home/rc/avocado-misc-tests-master/perf/compilebench.py:Compilebench.test: -
Avocado crashed: RuntimeError: maximum recursion depth exceeded in cmp
Traceback (most recent call last):

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/job.py", line 474, in run_tests
    execution_order)

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 616, in run_suite
    deadline):

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/runner.py", line 396, in run_test
    proc.start()

  File "/usr/lib/python2.7/multiprocessing/process.py", line 130, in start
    self._popen = Popen(self)

  File "/usr/lib/python2.7/multiprocessing/forking.py", line 126, in __init__
    code = process_obj._bootstrap()

  File "/usr/lib/python2.7/multiprocessing/process.py", line 274, in _bootstrap
    sys.stderr.write('Process %s:\n' % self.name)

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 640, in write
    self._log_line("%s\n" % data_lines[0])

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 651, in _log_line
    logger.log(self._level, prefix + line)

  File "/usr/lib/python2.7/logging/__init__.py", line 1216, in log
    self._log(level, msg, args, **kwargs)

  File "/usr/lib/python2.7/logging/__init__.py", line 1271, in _log
    self.handle(record)

  File "/usr/lib/python2.7/logging/__init__.py", line 1281, in handle
    self.callHandlers(record)

  File "/usr/lib/python2.7/logging/__init__.py", line 1321, in callHandlers
    hdlr.handle(record)

  File "/usr/lib/python2.7/logging/__init__.py", line 749, in handle
    self.emit(record)

  File "/usr/lib/python2.7/logging/__init__.py", line 942, in emit
    StreamHandler.emit(self, record)

  File "/usr/lib/python2.7/logging/__init__.py", line 879, in emit
    self.handleError(record)

  File "/usr/lib/python2.7/logging/__init__.py", line 802, in handleError
    None, sys.stderr)

  File "/usr/lib/python2.7/traceback.py", line 124, in print_exception
    _print(file, 'Traceback (most recent call last):')

  File "/usr/lib/python2.7/traceback.py", line 13, in _print
    file.write(str+terminator)

... (traceback between line 474, in run_tests and line 13, in _print repeated many times)

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 640, in write
    self._log_line("%s\n" % data_lines[0])

  File "/usr/local/lib/python2.7/dist-packages/avocado/core/output.py", line 651, in _log_line
    logger.log(self._level, prefix + line)

  File "/usr/lib/python2.7/logging/__init__.py", line 1216, in log
    self._log(level, msg, args, **kwargs)

  File "/usr/lib/python2.7/logging/__init__.py", line 1270, in _log
    record = self.makeRecord(self.name, level, fn, lno, msg, args, exc_info, func, extra)

  File "/usr/lib/python2.7/logging/__init__.py", line 1244, in makeRecord
    rv = LogRecord(name, level, fn, lno, msg, args, exc_info, func)

  File "/usr/lib/python2.7/logging/__init__.py", line 271, in __init__
    self.module = os.path.splitext(self.filename)[0]

  File "/usr/lib/python2.7/posixpath.py", line 105, in splitext
    return genericpath._splitext(p, sep, altsep, extsep)

  File "/usr/lib/python2.7/genericpath.py", line 101, in _splitext
    if p[filenameIndex] != extsep:

RuntimeError: maximum recursion depth exceeded in cmp

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
Error running method "render" of plugin "xunit": [Errno 28] No space left on device
ERROR (770.18 s)
 (04/16) /home/rc/avocado-misc-tests-master/perf/hackbench.py:Hackbench.test:
Contributor

apahim commented Nov 8, 2017

That test is "community maintained". Please report the issue on its GitHub page. But wait, did you see the "[Errno 28] No space left on device" message? The test PASSed on my machine.

Author

fkromer commented Nov 8, 2017

Yes, the log file contains L0479 DEBUG| [stderr] IOError: [Errno 28] No space left on device.

Contributor

apahim commented Nov 8, 2017

That's what I'm pointing at. Maybe you should check your system?
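To see whether disk space is really the culprit, a quick stdlib check of the free space on the filesystem the tests write to can help. This is only a sketch: the path and the os.statvfs approach (which also works on Python 2.7, matching the traceback above) are my assumptions; compilebench's actual space requirement is not stated anywhere in this thread.

```python
import os


def free_gib(path="/"):
    """Return the free space available at `path` in GiB (Linux/POSIX).

    Uses os.statvfs, which exists on both Python 2.7 and 3.x, so it
    matches the Python 2.7 environment shown in the traceback.
    """
    st = os.statvfs(path)
    # f_bavail: free blocks available to unprivileged users,
    # f_frsize: fundamental filesystem block size.
    return st.f_bavail * st.f_frsize / float(1024 ** 3)


if __name__ == "__main__":
    print("Free space on /: %.1f GiB" % free_gib("/"))
```

Comparing this number before and after a failing run also shows how much space the benchmark actually consumed.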

Author

fkromer commented Nov 8, 2017

:) Do you know how much free space is required on the device? Depending on that, the test could be unsuitable anyway... BTW: Is there a quick way to look into the documentation of the tests?

Contributor

apahim commented Nov 8, 2017

Do you know how much free space is required on the device?

Sorry, I don't.

Is there a quick way to investigate the documentation of the tests?

There's no such documentation. But just blindly running all the tests from that generic/public test repository doesn't make much sense to me. They are pretty much intended to serve as a reference, so you can customize them for your needs or use them as examples to create your own tests. Ok, if you see something to improve, you should send a Pull Request there, but I doubt they will simply PASS out-of-the-box on every single system out there.

Author

fkromer commented Nov 9, 2017

There's no such documentation.

Having a look into the source code is totally fine for me. (Python is usually self-documenting enough to figure out what the code does even without comments or docstrings.)

But just blindly running all the tests from that generic/public test repository doesn't make much sense to me.
... but I doubt they will simply PASS out-of-the-box on every single system out there.

You are right. But since I run the tests on an embedded device running Ubuntu, I was not sure about the relevance/suitability of all tests. Running all of them and analysing the tests that cancel or fail seemed reasonable to me. (E.g. the memory on embedded Linux devices is far more limited than on server machines, which means I would either have to configure the maximum memory consumption of a test somehow or skip it.)

Contributor

apahim commented Nov 9, 2017

Ok, cool. For the tests you want to skip, you can decorate them with a skip decorator: http://avocado-framework.readthedocs.io/en/55.0/WritingTests.html#avocado-skip-decorators.

If you do that conditionally, e.g. for embedded systems, you can send a pull request with your changes so that the next person running those tests on embedded devices will have them skipped as well.

Here's an example: https://github.com/avocado-framework-tests/avocado-misc-tests/blob/master/generic/ras_extended.py#L50
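A conditional skip for memory-limited hosts could look like the sketch below. Per the linked docs, avocado's @avocado.skipIf(condition, message) has the same shape as unittest.skipIf, which is used here so the sketch runs without avocado installed; the 1 GiB threshold and the MemTotal-based detection are arbitrary illustrations, not documented requirements of compilebench.

```python
import unittest


def total_mem_kib():
    """Read MemTotal (in KiB) from /proc/meminfo (Linux only)."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                # Line looks like: "MemTotal:  16308804 kB"
                return int(line.split()[1])
    return 0


# Hypothetical threshold: treat hosts with less than ~1 GiB as "embedded".
LOW_MEMORY = total_mem_kib() < 1024 * 1024


class CompilebenchLike(unittest.TestCase):
    # With avocado, this would be @avocado.skipIf(LOW_MEMORY, "...") on an
    # avocado.Test subclass; the decorator semantics are the same.
    @unittest.skipIf(LOW_MEMORY, "host has too little memory for this benchmark")
    def test(self):
        self.assertTrue(True)  # the real benchmark body would go here


if __name__ == "__main__":
    unittest.main(exit=False)
```

Because the condition is evaluated at import time, the skip decision is made once per run, before any benchmark work starts.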

Author

fkromer commented Nov 9, 2017

If you do that conditionally, e.g. for embedded systems, you can send a pull request with your changes so that the next person running those tests on embedded devices will have them skipped as well.

@skipIf was exactly what I was thinking about 😉 (Self-configuring tests that adapt to the environment conditions seem a bit too advanced for now.)

Author

fkromer commented Nov 9, 2017

Closed because the issue was moved to the avocado-misc-tests repository.

@fkromer fkromer closed this as completed Nov 9, 2017