Track Tests based on Milestones/Tags/Deployments/Points/Markers in Time #114

Closed
StephenOTT opened this issue Apr 26, 2016 · 3 comments

Comments

@StephenOTT

It would be great if there were a way to attach a "tag" to a test run and track it against the graphs and test reports. This would let you see whether accessibility issues increase or decrease after a specific code change.

Is there a way to do this? I could not find anything in the docs.

@josebolos
Member

Hi @StephenOTT

Unfortunately, this is not possible at the moment. It's definitely on the roadmap, and we're very interested in making this happen, but it will require a significant amount of work, so I don't expect it to land anytime soon.

@StephenOTT
Author

StephenOTT commented Apr 28, 2016

@joseluisbolos I have not inspected the code heavily, but why is it a significant amount of work? Ignoring the visuals/graphs for a moment, isn't this mostly a matter of adding an extra field to the data model to store a text value that labels the point in time when the tests are run for a specific website, and then exposing that field through the API?

I assume there are complexities that are not apparent?
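To make the idea concrete, here is a rough sketch in TypeScript. The `ResultRecord` shape, the `label` field name, and the helper functions are assumptions for illustration only, not the dashboard's actual data model or API:

```ts
// Hypothetical sketch: a labelled test result and a way to query by label.
// Field names and record shape are assumptions, not the real dashboard schema.

interface ResultRecord {
    taskId: string;                                          // which monitored site/task this result belongs to
    date: string;                                            // ISO timestamp of the test run
    count: { error: number; warning: number; notice: number };
    label?: string;                                          // free-text marker, e.g. a release tag or deploy id
}

// Attach a label when storing a new result.
function withLabel(result: ResultRecord, label: string): ResultRecord {
    return { ...result, label };
}

// Expose the label through the API, e.g. let clients fetch results for one marker.
function resultsForLabel(results: ResultRecord[], label: string): ResultRecord[] {
    return results.filter((result) => result.label === label);
}

// Example: compare error counts before and after a hypothetical "v2.3.0" deploy marker.
const results: ResultRecord[] = [
    { taskId: 'abc', date: '2016-04-25T10:00:00Z', count: { error: 12, warning: 3, notice: 40 } },
    withLabel(
        { taskId: 'abc', date: '2016-04-27T10:00:00Z', count: { error: 7, warning: 2, notice: 38 } },
        'v2.3.0'
    ),
];
console.log(resultsForLabel(results, 'v2.3.0').map((r) => r.count.error)); // [7]
```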

@rowanmanning
Member

Going to close this as we're not prioritising feature work in Dashboard. If somebody finds this and does the work, we'll still accept a PR.
