Bug 1336272 - Refactor changing wording from resultset to push (#2644)
Introduces the new environment variable `PULSE_PUSH_SOURCES`.

Keep old `publish-resultset-runnable-job-action` task name by creating a 
method that points to `publish_push_runnable_job_action`.
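The backward-compatibility approach described in the commit message can be sketched as a simple delegating method. This is an illustrative stand-in: the method names come from the commit message, but the surrounding class and signature are hypothetical.

```python
class PulsePublisher:
    """Hypothetical stand-in for Treeherder's pulse publisher class."""

    def publish_push_runnable_job_action(self, project, push_id, requester):
        # New canonical method name (body stubbed for illustration).
        return {"project": project, "push_id": push_id, "requester": requester}

    def publish_resultset_runnable_job_action(self, project, resultset_id, requester):
        # Old task name kept alive: it simply delegates to the new method,
        # so existing consumers of the old name keep working.
        return self.publish_push_runnable_job_action(project, resultset_id, requester)
```

Keeping the old name as a thin wrapper lets already-queued tasks referencing `publish-resultset-runnable-job-action` resolve without a coordinated deploy.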
maxchehab authored and camd committed Aug 4, 2017
1 parent 3648340 commit c6e0c26
Showing 57 changed files with 506 additions and 473 deletions.
2 changes: 1 addition & 1 deletion Procfile
@@ -7,7 +7,7 @@ worker_buildapi_4hr: newrelic-admin run-program celery worker -A treeherder --wi
worker_store_pulse_jobs: newrelic-admin run-program celery worker -A treeherder --without-gossip --without-mingle --without-heartbeat -Q store_pulse_jobs --concurrency=3
worker_store_pulse_resultsets: newrelic-admin run-program celery worker -A treeherder --without-gossip --without-mingle --without-heartbeat -Q store_pulse_resultsets --concurrency=3
worker_read_pulse_jobs: newrelic-admin run-program ./manage.py read_pulse_jobs
worker_read_pulse_resultsets: newrelic-admin run-program ./manage.py read_pulse_resultsets
worker_read_pulse_pushes: newrelic-admin run-program ./manage.py read_pulse_pushes
worker_default: newrelic-admin run-program celery worker -A treeherder --without-gossip --without-mingle --without-heartbeat -Q default,cycle_data,calculate_durations,fetch_bugs,fetch_allthethings,generate_perf_alerts,seta_analyze_failures --concurrency=3
worker_hp: newrelic-admin run-program celery worker -A treeherder --without-gossip --without-mingle --without-heartbeat -Q classification_mirroring,publish_to_pulse --concurrency=1
worker_log_parser: newrelic-admin run-program celery worker -A treeherder --without-gossip --without-mingle --without-heartbeat -Q log_parser,log_parser_fail,log_store_failure_lines,log_store_failure_lines_fail,log_crossreference_error_lines,log_crossreference_error_lines_fail,log_autoclassify,log_autoclassify_fail --maxtasksperchild=50 --concurrency=7
1 change: 0 additions & 1 deletion bin/run_celery_worker_buildapi
@@ -8,4 +8,3 @@ source /etc/profile.d/treeherder.sh
exec newrelic-admin run-program celery -A treeherder worker -Q buildapi_pending,buildapi_running,buildapi_4hr,store_pulse_jobs,store_pulse_resultsets \
--concurrency=5 -l INFO \
-n buildapi.%h

@@ -5,4 +5,4 @@ cd $SRC_DIR

source /etc/profile.d/treeherder.sh

exec newrelic-admin run-program ./manage.py read_pulse_resultsets
exec newrelic-admin run-program ./manage.py read_pulse_pushes
2 changes: 1 addition & 1 deletion docs/pulseload.rst
@@ -41,7 +41,7 @@ Then to get those jobs loaded into Treeherder, start the periodic tasks with

celery -A treeherder worker -B -Q pushlog,store_pulse_jobs --concurrency 5

.. note:: It is important to run the ``pushlog`` queue processing as well as ``store_pulse_jobs`` because jobs that come in from pulse for which Treeherder does not already have a resultset will be skipped.
.. note:: It is important to run the ``pushlog`` queue processing as well as ``store_pulse_jobs`` because jobs that come in from pulse for which Treeherder does not already have a push will be skipped.

If you want to just run all the Treeherder *Celery* tasks to enable things like
log parsing, etc, then don't specify the ``-Q`` param and it will default to
2 changes: 1 addition & 1 deletion docs/rest_api.rst
@@ -2,7 +2,7 @@ REST API
========

Treeherder provides a REST API which can be used to query for all the
resultset, job, and performance data it stores internally. To allow
push, job, and performance data it stores internally. To allow
inspection of this API, we use Swagger_, which provides a friendly
browsable interface to Treeherder's API endpoints. After setting up a
local instance of Treeherder, you can access Swagger at
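Given the rename in this diff, the push data presumably lives under a `/push/` endpoint of the project API. A minimal sketch of building such a URL follows; the `/api/project/<project>/push/` path and the `count` parameter are assumptions inferred from this commit, not verified against the live API.

```python
from urllib.parse import urlencode

def push_endpoint_url(server, project, **params):
    # e.g. push_endpoint_url("https://treeherder.mozilla.org",
    #                        "mozilla-central", count=10)
    base = "{}/api/project/{}/push/".format(server.rstrip("/"), project)
    return base + ("?" + urlencode(sorted(params.items())) if params else "")
```

The same URL shape is what Swagger would expose for interactive browsing of the endpoint.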
10 changes: 5 additions & 5 deletions docs/retrieving_data.rst
@@ -3,22 +3,22 @@ Retrieving Data

The :ref:`Python client <python-client>` also has some convenience
methods to query the Treeherder API. It is still in active development,
but already has methods for getting resultset and job information.
but already has methods for getting push and job information.

See the :ref:`Python client <python-client>` section for how to control
which Treeherder instance will be accessed by the client.

Here's a simple example which prints the start timestamp of all the
jobs associated with the last 10 result sets on mozilla-central:
jobs associated with the last 10 pushes on mozilla-central:

.. code-block:: python
from thclient import TreeherderClient
client = TreeherderClient()
resultsets = client.get_resultsets('mozilla-central') # gets last 10 by default
for resultset in resultsets:
jobs = client.get_jobs('mozilla-central', result_set_id=resultset['id'])
pushes = client.get_pushes('mozilla-central') # gets last 10 by default
for push in pushes:
jobs = client.get_jobs('mozilla-central', push_id=push['id'])
for job in jobs:
print job['start_timestamp']
4 changes: 2 additions & 2 deletions docs/submitting_data.rst
@@ -127,7 +127,7 @@ Using the Python Client
-----------------------

There are two types of data structures you can submit with the :ref:`Python client
<python-client>`: job and resultset collections. The client provides methods
<python-client>`: job and push collections. The client provides methods
for building a data structure that Treeherder will accept. Data
structures can be extended with new properties as needed; a
minimal validation protocol is applied that confirms the bare minimum
@@ -144,7 +144,7 @@ Job Collections

Job collections can contain test results from any kind of test. The
`revision` provided should match the associated `revision` in the
resultset structure. The `revision` is the top-most revision in the push.
push structure. The `revision` is the top-most revision in the push.
The `job_guid` provided can be any unique string of 50
characters at most. A job collection has the following data structure.

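The actual data structure is collapsed in this diff, but the constraints stated above (matching `revision`, `job_guid` of at most 50 characters) can be sketched with plain dicts. Field names here are illustrative only, not the client's exact schema.

```python
def make_job(revision, job_guid, state="completed", result="success"):
    # `revision` must match the push's top-most revision; `job_guid` is any
    # unique string of at most 50 characters, per the docs above.
    if len(job_guid) > 50:
        raise ValueError("job_guid must be 50 characters at most")
    return {
        "revision": revision,
        "job": {"job_guid": job_guid, "state": state, "result": result},
    }

# A job collection is then just a list of such entries.
collection = [make_job("4f04d52bcd15", "job-guid-0001")]
```

Validating the 50-character limit client-side mirrors the "minimal validation protocol" the docs mention.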
78 changes: 39 additions & 39 deletions docs/testcases.rst
@@ -11,15 +11,15 @@ Load Treeherder. eg.

Depending on your test requirement.

**Expected**: Page loads displaying resultsets pushed to mozilla-inbound.
**Expected**: Page loads displaying pushes pushed to mozilla-inbound.

Treeherder logo > Perfherder

**Expected**: Perfherder loads displaying its initial Graph page.

Perfherder logo > Treeherder

**Expected**: Treeherder loads again, displaying resultsets per step 1.
**Expected**: Treeherder loads again, displaying pushes per step 1.

Check Job details Tab selection
------
@@ -126,11 +126,11 @@ Switch repos
------
Click on the Repos menu, select a different repo.

**Expected**: The new repo and its resultsets should load.
**Expected**: The new repo and its pushes should load.

Reverse the process, and switch back.

**Expected**: The original repo and resultsets should load.
**Expected**: The original repo and pushes should load.

Toggle unclassified failures
------
@@ -162,9 +162,9 @@ Select any job and click on the adjacent "(sig)" signature link.

**Expected**: Ensure only jobs using that unique signature SHA are visible.

Pin all visible jobs in resultset
Pin all visible jobs in push
------
Click on the Pin 'all' pin-icon in the right hand side of any resultset bar.
Click on the Pin 'all' pin-icon in the right hand side of any push bar.

**Expected**: Up to a maximum of 500 jobs should be pinned, and a matching notification warning should appear if exceeded.

@@ -222,25 +222,25 @@ Select any completed job and click the raw log button in the lower navbar.

**Expected**: The raw log for that job should load in a new tab.

View resultsets by Author
View pushes by Author
------
Click on the Author email (eg. ryanvm@gmail.com) in a resultset bar.
Click on the Author email (eg. ryanvm@gmail.com) in a push bar.

**Expected**: Only resultsets pushed by that Author should appear.
**Expected**: Only pushes pushed by that Author should appear.

Get next 10| resultsets via the main page footer.
Get next 10| pushes via the main page footer.

**Expected**: Only resultsets from that Author should be added.
**Expected**: Only pushes from that Author should be added.

View a single resultset
View a single push
------
Load Treeherder and click on the 'Date' on the left side of any resultset.
Load Treeherder and click on the 'Date' on the left side of any push.

**Expected**: Only that resultset should load, with an accompanying URL param "&revision=(SHA)"
**Expected**: Only that push should load, with an accompanying URL param "&revision=(SHA)"

(optional) Wait a minute or two for ingestion updates.

**Expected**: Only newly started jobs for that same resultset (if any have occurred) should appear. No new resultsets should load.
**Expected**: Only newly started jobs for that same push (if any have occurred) should appear. No new pushes should load.

Quick Filter input field
------
@@ -256,63 +256,63 @@ Click the grey (x) 'Clear this filter' icon the right hand side of the input fie

**Expected**: Filter should be cleared and input should shrink to original width.

Check resultset actions menu
Check push actions menu
------
From any resultset bar, select each entry in the far right dropdown that doesn't involve retriggers. eg:
From any push bar, select each entry in the far right dropdown that doesn't involve retriggers. eg:

Bugherder,
BuildAPI,
Revision URL List

**Expected**: Each should open without error or hanging.

Get next 10|20|50 resultsets
Get next 10|20|50 pushes
------
Click on Get next 10| resultsets.
Click on Get next 10| pushes.

**Expected**: Ensure exactly 10 additional resultsets were loaded.
**Expected**: Ensure exactly 10 additional pushes were loaded.

Click on Get next 50| resultsets.
Click on Get next 50| pushes.

**Expected**: Ensure the page has a reasonable load time of ~10 seconds.

View a single resultset via its Date link. Click Get next 10| resultsets.
View a single push via its Date link. Click Get next 10| pushes.

**Expected**: Ensure the page loads the 10 prior resultsets and the "tochange" and "fromchange" in the url appear correct.
**Expected**: Ensure the page loads the 10 prior pushes and the "tochange" and "fromchange" in the url appear correct.

Filter resultsets by URL fromchange, tochange
Filter pushes by URL fromchange, tochange
------
See also the Treeherder `userguide`_ for URL Query String Parameters. Please test variants and perform exploratory testing, as top/bottom of range is new functionality (Jun 3, '15).
Navigate to the 2nd resultset loaded, from the resultset action menu select 'Set as top of range'.
Navigate to the 2nd push loaded; from the push action menu, select 'Set as top of range'.

**Expected**: Ensure: (1) 1st resultset is omitted (2) url contains `&tochange=SHA` and (3) ten resultsets are loaded from that new top
**Expected**: Ensure: (1) 1st push is omitted (2) url contains `&tochange=SHA` and (3) ten pushes are loaded from that new top

Navigate to the 3rd resultset loaded and select 'Set as bottom of range'
Navigate to the 3rd push loaded and select 'Set as bottom of range'

**Expected**: Ensure (1) only the 3 ranged resultsets are loaded (2) url contains '&tochange=[top-SHA]&fromchange=[bottom-SHA]'
**Expected**: Ensure (1) only the 3 ranged pushes are loaded (2) url contains '&tochange=[top-SHA]&fromchange=[bottom-SHA]'

Click Get Next | 10 in the page footer.

**Expected**: Ensure 10 additional pages load for a total of 13 resultsets.
**Expected**: Ensure 10 additional pushes load for a total of 13 pushes.

(optional) wait a minute or two for job and resultset updates
(optional) wait a minute or two for job and push updates

**Expected**: Updates should only occur for the visible resultsets. No new resultsets should appear.
**Expected**: Updates should only occur for the visible pushes. No new pushes should appear.

Filter resultsets by URL date range
Filter pushes by URL date range
------
See also the Treeherder `userguide`_ for URL Query String Parameters
Add a revision range to the URL in the format, eg:

&startdate=2015-09-28&enddate=2015-09-28

Warning: With the latest volume of jobs and resultsets, anything greater than a single day window risks loading too much data for the browser with Treeherder default filter and exclusion settings.
Warning: With the latest volume of jobs and pushes, anything greater than a single day window risks loading too much data for the browser with Treeherder default filter and exclusion settings.

**Expected**: Resultsets loaded should honor that range.
**Expected**: Pushes loaded should honor that range.

(Optional) Wait for new pushes to that repo.

**Expected**: Resultsets loaded should continue to honor that range.
**Expected**: Pushes loaded should continue to honor that range.

Modify Exclusion Profiles in the Sheriff panel
------
@@ -380,7 +380,7 @@ Check all keyboard shortcut functionality as listed in the `userguide`_.

Job counts
------
In any resultset with job counts, click on the group button eg. B( ) to expand the count.
In any push with job counts, click on the group button eg. B( ) to expand the count.

**Expected**: Jobs should appear.

@@ -394,18 +394,18 @@ Click in empty space to deselect the collapsed job.

Click on the ( + ) global Expand/Collapse icon in the navbar to toggle all +n counts.

**Expected**: Counts should expand and collapse on all visible resultsets.
**Expected**: Counts should expand and collapse on all visible pushes.

Navigate via the n,p and left/right keys.

**Expected**: +n counts should be skipped during navigation.

expand all the groups, (the url querystring will reflect this) then reload the page

**Expected**: groups should still be expanded for all resultsets
**Expected**: groups should still be expanded for all pushes

Optional: There are other variants that can be tested: classification of expanded job count members, Filters, and any other workflow integration testing.

.. _`stage`: https://treeherder.allizom.org
.. _`production`: https://treeherder.mozilla.org
.. _`userguide`: https://treeherder.allizom.org/userguide.html
.. _`userguide`: https://treeherder.mozilla.org/userguide.html
File renamed without changes.
33 changes: 24 additions & 9 deletions tests/client/test_treeherder_client.py
@@ -194,10 +194,10 @@ class TreeherderClientTest(DataSetup, unittest.TestCase):
{"jobDetail2": 2},
{"jobDetail3": 3}
]
RESULTSETS = [{"resultSet1": 1},
{"resultSet2": 2},
{"resultSet3": 3}
]
PUSHES = [{"push1": 1},
{"push2": 2},
{"push3": 3}
]

@responses.activate
def test_post_job_collection(self):
@@ -256,20 +256,35 @@ def test_get_job(self):
self.assertEqual(len(jobs), 3)
self.assertEqual(jobs, self.JOB_RESULTS)

@responses.activate
def test_get_pushes(self):
tdc = TreeherderClient()
url = tdc._get_endpoint_url(tdc.PUSH_ENDPOINT, project='mozilla-inbound')
content = {
"meta": {"count": 3, "repository": "mozilla-inbound",
"offset": 0},
"results": self.PUSHES
}
responses.add(responses.GET, url, json=content, match_querystring=True, status=200)

pushes = tdc.get_pushes("mozilla-inbound")
self.assertEqual(len(pushes), 3)
self.assertEqual(pushes, self.PUSHES)

@responses.activate
def test_get_results(self):
tdc = TreeherderClient()
url = tdc._get_endpoint_url(tdc.RESULTSET_ENDPOINT, project='mozilla-inbound')
url = tdc._get_endpoint_url(tdc.PUSH_ENDPOINT, project='mozilla-inbound')
content = {
"meta": {"count": 3, "repository": "mozilla-inbound",
"offset": 0},
"results": self.RESULTSETS
"results": self.PUSHES
}
responses.add(responses.GET, url, json=content, match_querystring=True, status=200)

resultsets = tdc.get_resultsets("mozilla-inbound")
self.assertEqual(len(resultsets), 3)
self.assertEqual(resultsets, self.RESULTSETS)
pushes = tdc.get_resultsets("mozilla-inbound")
self.assertEqual(len(pushes), 3)
self.assertEqual(pushes, self.PUSHES)


if __name__ == '__main__':
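The `test_get_results` case above still calls `get_resultsets` against the push endpoint, suggesting the old client method is kept as a deprecated wrapper around `get_pushes`. The pattern can be sketched as follows; this is a simplified, hypothetical stand-in, not the real `thclient` code.

```python
import warnings

class Client:
    """Sketch of the compatibility pattern; HTTP access is stubbed out."""

    def get_pushes(self, project, **params):
        # Stub standing in for the real GET against the push endpoint.
        return [{"push1": 1}, {"push2": 2}, {"push3": 3}]

    def get_resultsets(self, project, **params):
        # Deprecated alias retained so existing callers keep working.
        warnings.warn("get_resultsets is deprecated; use get_pushes",
                      DeprecationWarning, stacklevel=2)
        return self.get_pushes(project, **params)
```

Emitting `DeprecationWarning` gives downstream consumers a migration signal without breaking them, matching the transitional approach this commit takes elsewhere.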
