Disable WMS server test

m-kuhn committed Nov 19, 2017
1 parent af6b4cc commit 8f1021c5b8c1ec6062fe730c49229bd1e85596a8
Showing with 1 addition and 0 deletions.
  1. +1 −0 .ci/travis/linux/blacklist.txt
@@ -29,6 +29,7 @@ PyQgsSpatialiteProvider
 
 # Flaky, see https://travis-ci.org/qgis/QGIS/jobs/297708174
 PyQgsServerAccessControl
+PyQgsServerWMS
 
 # Need a local postgres installation
 PyQgsAuthManagerPKIPostgresTest
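For context, .ci/travis/linux/blacklist.txt is simply a list of test names to skip on Travis, with '#' lines acting as comments. A minimal sketch of how such a file could be turned into an exclusion pattern for the test runner (hypothetical Python, not the actual QGIS CI script):

# Hypothetical sketch, not the real QGIS CI script: read a blacklist file
# (one test name per line, '#' starts a comment) and build an exclusion
# pattern that could be handed to a runner such as ctest via -E.
def load_blacklist(path):
    names = []
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line and not line.startswith("#"):
                names.append(line)
    return names

if __name__ == "__main__":
    blacklisted = load_blacklist(".ci/travis/linux/blacklist.txt")
    # e.g. ctest -E "PyQgsServerAccessControl|PyQgsServerWMS|..."
    print("|".join(blacklisted))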

4 comments on commit 8f1021c

@m-kuhn replied Nov 19, 2017

I had to disable this test as it has become really unstable recently. I'm sorry, but I'm not able to fix it myself, and it's currently giving many false alarms.

Can someone take over to re-enable (parts of) it?

It looks like it has been failing repeatedly since #5664 (see https://travis-ci.org/qgis/QGIS/builds), but since the PR itself was green and the test was already failing quite often before, I'm not convinced there's a connection between the PR and the test results.

Thanks a lot

@elpaso @pblottiere @rldhont

@pblottiere replied Nov 20, 2017

Argh... I thought that Even's commit on the SpatiaLite provider (581d0d3) had stabilized these tests...

BTW, according to Travis, it seems that it's a real error this time (and not some flakiness):

======================================================================
FAIL: test_wms_GetProjectSettings_wms_print_layers (__main__.TestQgsServerWMS)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/root/QGIS/tests/src/python/test_qgsserver_wms.py", line 1964, in test_wms_GetProjectSettings_wms_print_layers
    self.assertTrue(xmlResult.find("<WMSBackgroundLayer>1</WMSBackgroundLayer>") != -1)
AssertionError: False is not true
I'll take a look and fix it. Moreover, I'm gonna split these WMS tests into GetMap, GetPrint, and so on. This way, we'll be able to deactivate only some of them (and not all of them).
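A rough sketch of what such a split could look like (class names are hypothetical; the point is that the Travis blacklist operates on whole test names, so smaller, per-request-type tests can be deactivated individually):

import unittest

class TestQgsServerWMSGetMap(unittest.TestCase):
    # Hypothetical: GetMap requests and image comparisons would live here.
    def test_getmap_basic(self):
        self.skipTest("placeholder for GetMap checks")

class TestQgsServerWMSGetPrint(unittest.TestCase):
    # Hypothetical: GetPrint / layout rendering checks would live here.
    def test_getprint_basic(self):
        self.skipTest("placeholder for GetPrint checks")

class TestQgsServerWMSGetProjectSettings(unittest.TestCase):
    # Hypothetical: capabilities checks such as <WMSBackgroundLayer> would live here.
    def test_getprojectsettings_basic(self):
        self.skipTest("placeholder for GetProjectSettings checks")

if __name__ == "__main__":
    unittest.main()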

@elpaso replied Nov 20, 2017

@pblottiere no: unfortunately the test was failing almost all the time. I had a look yesterday and found mainly tiny rendering differences. Can we increase the acceptable difference threshold?
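The "acceptable difference threshold" here means tolerating a bounded number of mismatching pixels when a rendered image is compared against its reference, rather than requiring an exact match. A minimal sketch of that idea (hypothetical helper using Pillow, not the QGIS test framework's own comparison):

from PIL import Image, ImageChops

def images_match(rendered_path, reference_path, max_mismatched_pixels=100):
    # Hypothetical helper: allow up to max_mismatched_pixels differing pixels.
    rendered = Image.open(rendered_path).convert("RGB")
    reference = Image.open(reference_path).convert("RGB")
    if rendered.size != reference.size:
        return False
    diff = ImageChops.difference(rendered, reference)
    # Count pixels where any channel differs from the reference image.
    mismatched = sum(1 for pixel in diff.getdata() if pixel != (0, 0, 0))
    return mismatched <= max_mismatched_pixels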

@pblottiere replied Nov 20, 2017

no: unfortunately the test was failing almost all the time

:(. Can I take a look at your Travis builds somewhere?

I found mainly tiny rendering differences, can we increase the acceptable difference threshold?

OK. I'm gonna do that at the same time.

We really have to tackle these issues; it's annoying for everybody. But I haven't succeeded in reproducing them locally so far...
