Mobile App Performance Challenge: Native vs. HTML5 Hybrid Apps

# CMUSV SE Practicum 2013

This repository houses the work produced in the project Mobile App Performance Challenge: Native vs. HTML5 Hybrid Apps. The contributors are Aristide, Madhok, Prabhjot, Rashmi, and Shama. We carried out the project in the context of the CMU Silicon Valley software engineering practicum course; the sponsor is Appception Inc.

## Goals

The project has two goals:

  1. Benchmarking the performance of HTML5 UI operations and optimizations in mobile user agents.
  2. Comparing the appearance, behavior, and performance of hybrid apps (WebView, UIWebView) versus native apps (iOS, Android).

## Project components

The project has five major components, each covering a specific scope within the goals. The components, and their scope, are:

  • Android Native App: This is Synonymous, a thesaurus app built for comparison with a hybrid counterpart. It's documented in its specific README file.

  • iOS Native App: This is Synonymous, a thesaurus app built for comparison with a hybrid counterpart. It's documented in its specific README file.

  • Hybrid App (Android & iOS): This is Synonymous, a thesaurus app built for comparison with its native counterparts. It's documented in its specific README file.

  • Self-timing test suite: This is a set of benchmarking tests executed in mobile browsers and hybrid containers. It's documented in its specific README file.
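    As an illustration of what a self-timing test can look like (a sketch only; the operation names and measured UI operations here are assumptions, and the actual tests are documented in the suite's README), a small harness records timestamps around a repeated operation and reports the average cost:

    ```javascript
    // Minimal self-timing harness sketch. Uses performance.now() where
    // available (high-resolution timer in modern mobile user agents) and
    // falls back to Date.now() in older browsers.
    const now = (typeof performance !== 'undefined' && performance.now)
      ? () => performance.now()
      : () => Date.now();

    // Time `op` over `iterations` runs and return the average in milliseconds.
    function benchmark(name, op, iterations) {
      const start = now();
      for (let i = 0; i < iterations; i++) {
        op();
      }
      const elapsed = now() - start;
      return { name, iterations, avgMs: elapsed / iterations };
    }

    // Example: time a cheap operation 10,000 times.
    const result = benchmark('string-concat', () => 'a' + Math.random(), 10000);
    console.log(result.name, result.avgMs.toFixed(4) + ' ms/op');
    ```

    In a browser-based suite the timed callback would perform a DOM or style operation instead of a string concatenation, and the results would be posted back to a collection server.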

  • Chrome developer tools timeline test suite: This is a set of benchmarking tests measuring the speed of basic style elements in Google Chrome across several Android devices, a Mac laptop, and a Windows 7 laptop. It's documented in its specific README file.

  • In addition to the two benchmarking components, we provide the database of the benchmarking test results that we collected, in compressed dump format (directory link). The dump file was created using `pg_dump`. To restore it to a database, use the following `pg_restore` command, replacing each `<field>` with the appropriate value:

        pg_restore --verbose --clean --no-acl --no-owner -h localhost -U <database username> -d <destination database name> <path of the dump file>