It would be great if there were a way to add a "tag" to a test run and track it against the graphs and test reports. This would make it possible to see whether accessibility issues increase or decrease after a specific code change.

Is there a way to do this? I could not find anything in the docs.
Unfortunately, this is not possible at the moment. It's definitely on the roadmap, though, and we're very interested in making it happen, but it will require a significant amount of work, so I don't expect it anytime soon.
@joseluisbolos I haven't inspected the code closely, but why would it be a significant amount of work? Ignoring the visuals/graphs for a moment, isn't this mostly a matter of adding an extra field to the data model — a text value the user can set to label the point in time when the tests are run for a specific website — and then exposing that field through the API?

I assume there are complexities that aren't apparent?
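To make the suggestion concrete, here is a minimal sketch of the idea. All the names here (`TestRun`, `runs_by_tag`, the fields) are hypothetical and do not reflect the project's actual data model; it only illustrates "add an optional label field, then let the API filter by it":

```python
# Illustrative sketch only: results gain an optional "tag" field,
# and a hypothetical API helper filters runs by it.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TestRun:
    url: str
    error_count: int
    tag: Optional[str] = None  # free-text label, e.g. a release or commit id


def runs_by_tag(runs: List[TestRun], tag: str) -> List[TestRun]:
    """Return only the runs labelled with the given tag."""
    return [run for run in runs if run.tag == tag]


runs = [
    TestRun("https://example.com", 12),
    TestRun("https://example.com", 7, tag="v2.0-release"),
]
print([r.error_count for r in runs_by_tag(runs, "v2.0-release")])  # [7]
```

Graphs and reports could then group or annotate results by this label, but that is the visual side set aside above.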