
Scottx611x/run selenium on travis only #1650

Merged
merged 7 commits into develop from scottx611x/run_selenium_on_travis_only on Apr 3, 2017

Conversation

scottx611x
Member

No description provided.


codecov-io commented Mar 27, 2017

Codecov Report

Merging #1650 into develop will increase coverage by 1.61%.
The diff coverage is 100%.


@@             Coverage Diff             @@
##           develop    #1650      +/-   ##
===========================================
+ Coverage    37.03%   38.65%   +1.61%     
===========================================
  Files          369      369              
  Lines        24093    25007     +914     
  Branches      1260     1260              
===========================================
+ Hits          8924     9667     +743     
- Misses       15169    15340     +171
Impacted Files Coverage Δ
refinery/selenium_testing/tests.py 100% <100%> (ø) ⬆️
refinery/tool_manager/utils.py 86.46% <0%> (-10.21%) ⬇️
refinery/tool_manager/tests.py 99.69% <0%> (-0.31%) ⬇️
refinery/tool_manager/serializers.py 100% <0%> (ø) ⬆️
refinery/tool_manager/admin.py 100% <0%> (ø) ⬆️
refinery/core/tests.py 99.74% <0%> (+0.03%) ⬆️
refinery/tool_manager/models.py 94.01% <0%> (+0.46%) ⬆️
refinery/core/api.py 52.4% <0%> (+1.72%) ⬆️
...r/management/commands/generate_tool_definitions.py 54.32% <0%> (+54.32%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ae20312...9852c5c.

@scottx611x scottx611x requested review from hackdna and removed request for hackdna March 31, 2017 13:02
@@ -26,6 +29,10 @@ class SeleniumTestBase(StaticLiveServerTestCase):

def setUp(self, site_login=True, initialize_guest=True,
public_group_needed=True):

# Start a pyvirtualdisplay for geckodriver to interact with
self.display = Display(visible=0, size=(1900, 1080))
Member
Why didn't we have this before? How was the size chosen?

Member Author

We did have this before, See here: https://github.com/refinery-platform/refinery-platform/pull/1650/files/c733de5aea8aa4871c925bf8cd17bf705758b71b#diff-f4bed93e08ec88c8c01e26197a799d9bL17

I've moved the display start into setUp() and the stopping of the virtual display into tearDown().
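The lifecycle change described above can be sketched as follows. This is a minimal stand-alone illustration, not the project's actual code: the `Display` class here is a stand-in for `pyvirtualdisplay.Display` (same constructor signature as in the diff) so the snippet runs without an X server or geckodriver installed.

```python
import unittest


class Display:
    """Stand-in for pyvirtualdisplay.Display so this sketch runs anywhere."""

    def __init__(self, visible=0, size=(1900, 1080)):
        self.visible = visible
        self.size = size
        self.running = False

    def start(self):
        self.running = True

    def stop(self):
        self.running = False


class SeleniumTestBase(unittest.TestCase):
    def setUp(self):
        # Start a virtual display for geckodriver to render into.
        self.display = Display(visible=0, size=(1900, 1080))
        self.display.start()

    def tearDown(self):
        # Stop the display so every test leaves a clean environment behind.
        self.display.stop()

    def test_display_is_running(self):
        self.assertTrue(self.display.running)
```

Pairing the start in setUp() with the stop in tearDown() guarantees each test gets a fresh display and that the display is torn down even when a test fails.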

Member

OK, and how was the size parameter chosen?

Member Author

@hackdna I just chose a screen resolution that I figured would be about the norm of what most people use nowadays. If we make the resolution too small, pages won't render properly, and tests will probably fail.



@skipIf("prod" not in os.getenv("DJANGO_SETTINGS_MODULE"),
Member

As I mentioned earlier, I think this is unnecessary as we have a way to run unit tests separately from Selenium tests. This also might trip us up down the road (why are Selenium tests not running?). I think a more flexible solution is provided in Django 1.10 (https://docs.djangoproject.com/en/1.10/topics/testing/tools/#tagging-tests).
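For reference, the Django 1.10 tagging feature linked above works roughly like this sketch. This is my paraphrase of the documented semantics, not Django's actual source: the decorator attaches a `tags` set to a test, and the runner (`manage.py test --tag selenium`) filters on it.

```python
# Rough sketch of Django 1.10's @tag decorator semantics (assumed, simplified).
def tag(*names):
    def decorator(obj):
        # Union with any tags already attached, so decorators stack.
        obj.tags = set(names) | getattr(obj, "tags", set())
        return obj
    return decorator


@tag("selenium")
def test_login_flow():
    """Hypothetical browser test that a runner could select with --tag selenium."""
    pass
```

The appeal of this approach is that nothing about the test runner changes based on environment variables; selecting which tests run is an explicit command-line choice.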

Member Author

@hackdna We don't currently have a way to run unit tests separately from Selenium tests. Sure, you can run an individual app's test suite, but this isn't a good solution, especially in the case of Refinery, because our apps are tightly coupled and changes in one app can break tests in another.

I think this solution is simple and works fine. If one wants to explicitly run Selenium tests locally, switch to prod mode. Otherwise, Selenium-based tests are run on Travis every time you push.

As for Django 1.10, yes, I mentioned that tagging feature to you... but again, the ability to use it is out of our reach for a while.
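The gating pattern under discussion can be sketched like this. It is a stand-alone illustration, not the PR's exact code: the helper name `on_prod_settings` and the `""` fallback for an unset DJANGO_SETTINGS_MODULE are my additions (note that the quoted diff line `"prod" not in os.getenv(...)` would raise a TypeError if the variable were unset, since `os.getenv` returns None).

```python
import os
import unittest


def on_prod_settings():
    """Return True when the active Django settings module looks like prod."""
    # os.getenv returns None when the variable is unset; fall back to ""
    # so the membership test cannot raise TypeError.
    return "prod" in (os.getenv("DJANGO_SETTINGS_MODULE") or "")


class SeleniumLoginTest(unittest.TestCase):
    """Hypothetical Selenium test gated on the active settings module."""

    @unittest.skipIf(not on_prod_settings(),
                     "Selenium tests run only under prod settings (e.g. on Travis)")
    def test_login(self):
        pass
```

One caveat with this pattern: the skip condition is evaluated once, at class-definition time, so changing the environment variable after the test module is imported has no effect.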

Member

I still think that having different test runner behavior based on an environment variable is potentially confusing. Even if someone forgets to run all the tests locally, the CI system will catch that.

@scottx611x scottx611x merged commit 2660dca into develop Apr 3, 2017
@scottx611x scottx611x deleted the scottx611x/run_selenium_on_travis_only branch April 3, 2017 20:22
Labels: none yet
Projects: none yet
Linked issues: none yet
4 participants