This repository has been archived by the owner on Feb 20, 2023. It is now read-only.

Gather telemetry on representative user experiences for performance testing #9069

Closed
mcomella opened this issue Mar 9, 2020 · 4 comments
Labels
performance (Possible performance wins)

Comments

@mcomella
Contributor

mcomella commented Mar 9, 2020

We'd like to have telemetry that tells us what a typical user session looks like so that we can create performance tests that are representative of real user experiences. For example, how many tabs do users typically have open? Perhaps we should run our startup and page load tests with that number of tabs.
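
For illustration, here's a minimal sketch of what recording one such metric could look like with the Glean SDK (the `Metrics.openTabCount` quantity metric below is hypothetical, not an existing Fenix metric; a real one would be declared in metrics.yaml and generated by glean_parser):

```kotlin
// Minimal sketch only: `Metrics.openTabCount` is a hypothetical Glean quantity
// metric that would be declared in metrics.yaml and generated by glean_parser.
import org.mozilla.fenix.GleanMetrics.Metrics

fun recordOpenTabCount(openTabs: Int) {
    // Record how many tabs are currently open so we can later pick a
    // representative tab count for the startup & page load tests.
    Metrics.openTabCount.set(openTabs.toLong())
}
```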

Acceptance criteria


@ecsmyth Please update the description if this doesn't match your expectations! If you have a list of criteria we'd want implemented, it'd be great if you could post it too.

Issue is synchronized with this Jira Task

@mcomella mcomella added the performance (Possible performance wins) label Mar 9, 2020
@mcomella mcomella added this to Needs prioritization in Performance, front-end roadmap via automation Mar 9, 2020
@mcomella mcomella changed the title Gather telemetry on representative user experiences Gather telemetry on representative user experiences for performance testing Mar 9, 2020
@mcomella mcomella moved this from Needs prioritization to Backlog (prioritized) in Performance, front-end roadmap Mar 9, 2020
@ecsmyth ecsmyth removed their assignment Mar 10, 2020
@mcomella
Contributor Author

@ecsmyth Migration telemetry landed in mozilla-mobile/android-components#6196 – can you validate that it meets your needs?

@ecsmyth ecsmyth removed their assignment Apr 15, 2020
@pslawless
Collaborator

What's left for us to tackle is Startup Telemetry (all other telemetry is either done or in progress with other teams).

What we should focus on for Startup Telemetry:

  1. It aligns with how we measure startup in CI (as close as we can realistically make the two).
  2. It is at the granularity Vesta has defined (home screen, applink, custom tab), along with how often startup is cold vs. warm (vs. hot? not as important); see the sketch below.
     - P1: "how was the app opened?" (home screen, applink, custom tab)
     - P1: cold vs. warm
     - P2: "how long did it take to open?"
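
For illustration, a rough sketch of what the two P1 pieces could look like as a single Glean labeled counter (the `PerfStartup.startupType` metric and its label scheme are assumptions here, not existing Fenix metrics):

```kotlin
// Rough sketch only: `PerfStartup.startupType` is a hypothetical labeled counter
// that would be declared in metrics.yaml and generated by glean_parser.
import org.mozilla.fenix.GleanMetrics.PerfStartup

enum class StartupSource { HOME_SCREEN, APP_LINK, CUSTOM_TAB } // P1: how was the app opened?
enum class StartupTemperature { COLD, WARM, HOT }              // P1: cold vs. warm (vs. hot)

fun recordStartupType(source: StartupSource, temperature: StartupTemperature) {
    // One counter label per (temperature, source) pair, e.g. "cold_custom_tab".
    val label = "${temperature.name.lowercase()}_${source.name.lowercase()}"
    PerfStartup.startupType[label].add()
}
```

The P2 question ("how long did it take to open?") would map more naturally to a Glean timing distribution, which is left out of this sketch.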

@mcomella
Contributor Author

> 1. It aligns with how we measure startup in CI (as close as we can realistically make the two).

Will be handled in #10069.

> 2. It is at the granularity Vesta has defined (home screen, applink, custom tab), along with how often startup is cold vs. warm (vs. hot? not as important).
>    - P1: "how was the app opened?" (home screen, applink, custom tab)
>    - P1: cold vs. warm
>    - P2: "how long did it take to open?"

These are filed as separate issues; I think this can be closed.

Performance, front-end roadmap automation moved this from Backlog (prioritized) to Done Jul 14, 2020
@mcomella
Contributor Author

> I think this can be closed.

There's an equivalent Jira task, https://jira.mozilla.com/browse/FXP-471, so I think we can discuss there whether a meta issue is still necessary or helpful.
