Perceived UX of Web Apps in the Wild: Benchmark data, exploratory analyses, and insights from large-scale crowdsourcing
SpeedPerception: Perceived UX of Web Apps in the Wild

SpeedPerception / Latest Updates:

SpeedPerception Phase-2 is now live. Challenge URL: http://speedperception.com/

  • SpeedPerception Phase-2 is looking at IR-500 + Alexa-1000 / Chrome / ChromeMobile / Desktop / Mobile
  • SpeedPerception Phase-1 looked at IR-500 / Chrome / Desktop

Phase-1 overview and key insights: http://dl.acm.org/citation.cfm?id=3098606 (winner of the ACM SIGCOMM Internet-QoE Best Paper Award)

What is SpeedPerception?

SpeedPerception is a large-scale web performance crowdsourcing study focused on the perceived loading performance of above-the-fold content. Clearly, no one likes slow-loading webpages. SpeedPerception aims to understand what “slow” and “fast” mean to the human end-user, and how these perceptions are affected by the structure of web applications. Traditional web performance metrics defined in W3C standards focus on timing each process along the content delivery pipeline, such as Time to First Byte (TTFB) and Page Load Time. We want to tackle the web performance measurement challenge from a different angle: one that puts user experience into focus. Since people primarily consume the web visually, we focus on the visual perception of the webpage loading process.
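To make the traditional timing-based metrics mentioned above concrete, here is a minimal Python sketch of how TTFB and Page Load Time are derived from browser timing data. The field names follow the W3C Navigation Timing API; the sample values are hypothetical and purely illustrative, not from the SpeedPerception dataset.

```python
# Sketch: deriving two traditional W3C timing metrics from a browser's
# navigation-timing record. Field names follow the Navigation Timing API;
# the sample values below are made up for illustration.

def ttfb(t):
    """Time to First Byte: first response byte relative to navigation start (ms)."""
    return t["responseStart"] - t["navigationStart"]

def page_load_time(t):
    """Page Load Time: end of the load event relative to navigation start (ms)."""
    return t["loadEventEnd"] - t["navigationStart"]

# Hypothetical timing record (epoch milliseconds, as browsers report them)
timing = {
    "navigationStart": 1_500_000_000_000,
    "responseStart":   1_500_000_000_180,  # first byte arrives after 180 ms
    "loadEventEnd":    1_500_000_002_400,  # load event completes after 2400 ms
}

print(ttfb(timing))            # 180
print(page_load_time(timing))  # 2400
```

Note that both metrics time steps of the delivery pipeline rather than what the user actually sees rendered above the fold, which is exactly the gap SpeedPerception examines.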

SpeedPerception / Goal:

Our goal is to create free, open-source, benchmark dataset(s) to advance the systematic study of how human end-users perceive the webpage loading process: the above-the-fold rendering in particular. Our belief (and hope) is that such benchmarks can provide a quantitative basis to compare different algorithms and spur computer scientists to make progress on helping quantify perceived webpage performance.

SpeedPerception / Team:

Qingzhu (Clark) Gao - Data Scientist @ Instart Logic
Parvez Ahammad - Head of Data Science & Machine Learning @ Instart Logic
Prasenjit Dey - Software Engineer @ Instart Logic

SpeedPerception / Collaborators:

Pat Meenan - Staff Engineer @ Google / Creator of http://WebPagetest.org
Estelle Weyl - Open Web Evangelist @ Instart Logic

SpeedPerception / Phase-1:

Phase-1 results: https://github.com/pahammad/SpeedPerception/tree/master/Phase-1
Phase-1 Web app code / Experimental Design Criteria: https://github.com/pdey/SpeedPerceptionApp
Phase-1 crowdsourcing challenge was hosted at http://speedperception.com/
Phase-1 crowdsourcing duration: July 28, 2016 to September 30, 2016.