Improving Code Quality by Integrating QA Testers with Analysts, Designers, Developers and Other Stakeholders

NormanChiflen edited this page Feb 14, 2012 · 2 revisions

QA/test collaboration with analysts, designers, developers and other stakeholders

Table of Contents

KICK OFF PROJECT

DURING DESIGN

DURING DEVELOPMENT

DURING BETA DEPLOY


KICK OFF PROJECT

  1. The project manager assesses the concept document (the product idea) and conducts a feasibility study (research on how feasible it is, e.g. does the technology allow implementation, what are the main devices/OSes, will you need to sign the application, costs, income, HTTP vs socket connections, risks, etc.).

  2. If the project is realistic, create the specification document. The initial stage of the project will be a statement of requirements (from client/analyst/management).

  3. Analysis of the requirements in the context of existing work practice (analyst + end user), development of the functional spec (analyst), walk-through and sign-off of the spec (whole team).

  4. QA gets involved at various stages of the software life cycle, and QA involves more than testing. The walkthrough of the spec is the first, and possibly most important, QA element: will the proposed solution deliver the stated requirements while remaining usable? The spec details app integration with the device and data definitions, the target environment, and storage and memory resources. QA produces the impact assessment based on CCD/CPD (https://groupshare.pri.o2.com/sites/CCD/default.aspx).

  5. Together with the analysts/requirements team, we define the product backlog in the sprint planning meeting. Team members will be very clear about the goal of the current sprint (sprint cards).

  6. Every backlog item makes allowance for design/testing in its estimation (story points).

  7. QA creates a checklist/guidelines and submits it to the design/dev team.

  8. Encourage pair programming and code review to ensure that designers/developers keep an eye on the checklist and maintain coding standards and conventions, thus reducing the number of bugs coming out of development. We aim to receive the minimum number of bugs from design/development.
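Parts of that checklist can be enforced mechanically before code review even starts. The sketch below assumes a hypothetical three-rule standard (no trailing whitespace, no lines over 120 characters, no leftover TODO/FIXME markers); it supplements pair programming and code review rather than replacing them:

```shell
#!/bin/sh
# Minimal coding-standards check. The three rules below are an assumed
# example standard -- substitute the rules from your team's checklist.
check_file() {
    file="$1"
    violations=0
    # Rule 1: no trailing whitespace
    if grep -nE ' +$' "$file" >/dev/null 2>&1; then
        echo "$file: trailing whitespace"
        violations=$((violations + 1))
    fi
    # Rule 2: no lines longer than 120 characters
    if grep -nE '.{121,}' "$file" >/dev/null 2>&1; then
        echo "$file: line over 120 characters"
        violations=$((violations + 1))
    fi
    # Rule 3: no leftover TODO/FIXME markers
    if grep -nE 'TODO|FIXME' "$file" >/dev/null 2>&1; then
        echo "$file: unresolved TODO/FIXME"
        violations=$((violations + 1))
    fi
    return "$violations"
}
```

A script like this can run over each changed file before check-in, with a non-zero exit code blocking the commit until the pair resolves the violations.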

DURING DESIGN

  1. Designers start planning and implementing the requirements (input), then pass artefacts (outputs) on to the dev and test teams.

  2. QA verifies the design as follows. UI: layout standards vs each platform's Human Interface Guidelines. UX: check controls, goal-based tests, and 'self-documenting' design based on the Human Interface Guidelines (HIG) for the required platform.

DURING DEVELOPMENT

  1. The dev team commences development, taking input from the design and test teams simultaneously. Most of the actual development will be done as pair programming. This will improve the quality of the code and the low-level architecture, which are important aspects for the customer. As with Scrum, we are not doing these practices by the book per se, but applying them to our specific situation. In this case, we have Jr./Jr. pairs doing the pair programming while the Sr. developer provides them with guidance in addition to doing other development.

  2. Before a developer checks code into the repository:

i. ensure that all conflicts are resolved (if necessary),

ii. merge into the right sandbox and tag (if necessary),

iii. ensure check-ins have sensible comments explaining the reason for the change, plus the issue-tracker ID to link the check-in with its ticket directly (see best practice).

iv. Compile the code successfully (debug build).

v. Fix all urgent/critical warnings in the console and report any minor warnings.

vi. Run Unit tests (automated).

vii. Ask another developer to quickly check the code, or, if the change is critical, ask QA to schedule a formal code review session.

viii. Check the code into the repository (GitHub).

ix. Check that the continuous integration build on the server is successful.

x. Check that the server produces the right artifacts/deliverables.

xi. Update the ticket (JIRA/Pivotal/Redmine).

xii. If you haven't received an auto notification, inform QA/test to run the test suite and code inspection/static analysis tools.
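Step iii above (linking every check-in to its ticket) is easy to enforce with a commit-message guard. A minimal sketch, assuming a JIRA-style `PROJ-123` key format (adjust the pattern to your tracker):

```shell
#!/bin/sh
# Reject commit messages that lack an issue-tracker ID (step iii above).
# The "PROJ-123: reason" shape is an assumption -- match your tracker's keys.
validate_commit_msg() {
    msg="$1"
    # Require a ticket ID followed by a non-empty explanation
    if echo "$msg" | grep -qE '^[A-Z]+-[0-9]+: .+'; then
        return 0
    fi
    echo "Rejected: message must look like 'PROJ-123: reason for change'" >&2
    return 1
}
```

Wired in as a Git `commit-msg` hook (Git passes the message file as `$1`), the hook body would be `validate_commit_msg "$(cat "$1")"`, so a non-conforming message aborts the commit.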

  3. At the same time, the quality/test team starts to develop test cases for the backlog, taking input from the development team.

  4. The QA team will not wait until all backlog items are completed. Testing and QC will interact with development (pair programming) to speed up the process; collaboration prevents misinterpretation of requirements (QA should be able to clarify requirement misunderstandings on the user's behalf).

  5. Development-phase testing activities include:

a) Test plan and design.

b) Developing test cases - methods, procedures and results.

c) Implement test cases.

d) Update the test documentation with every new build (test topic, area, steps, test result; if a bug, then the bug ID and link).

e) At the end of development, during quality validation, QA will use SmartGit to get clean source code from GitHub, run the continuous integration build pipeline (Xcode for iPhone, VS2010 for Windows, Eclipse/Maven for Android) and execute the unit tests.

f) Then run the following tests:

• QA and designers run a usability test: lab-based, diary studies or paper prototypes. A usability testing session will be arranged during the second iteration to ensure the usability of the application. A test case will be designed for this session by the QA manager. We will give the test subjects tasks to complete with the application, gather video data from the session and make notes about the errors the users make. There will be no guidance for the users besides the task descriptions.

• Unit tests: testing the 'atomic' units (classes, web pages, new code) independently (Dev).

• Upgrade/sanity/regression test: installation/un-installation of .apk/.ipa/.xap (Bluetooth, OTA, IrDA, data cable); cover all the modules of your application, the back-key mechanism, network-related issues (GPRS, WAP session, access point), concurrency (static and dynamic screens), call monitoring, abnormal exit, abuse, low battery, low disc space, time-outs and retries.

• Exploratory testing will be done during the last week of each sprint excluding the first sprint. The QA manager will provide test charters to the testers and the system is tested as a whole. The defects detected are reported to JIRA by the person who finds the defect.

• Integration tests: test the interaction of units; parallel run on a dummy server (Tester and Dev).

• System tests: use a profiler to check for memory and resource leaks, and test the whole, integrated system. Memory is one of the most precious resources on a mobile device because far less is available than on a desktop, so test to identify memory footprints and memory leaks (dedicated team). QA tests against the generic version to find all bugs and check that everything is developed according to the specification (an emulator can be used to test the generic version). Test cases should cover everything written in the specification.

• Acceptance tests: 'real-world' tests, under conditions as close to the live environment as possible (Client/QA/Analyst). Acceptance testing is done immediately after each sprint in a demo session. The goal is to find out whether the features implemented are of sufficient quality. If they are not, the related tasks are reopened or defects are reported to JIRA, and further development/fixing is done in the following sprint. Once the generic version is of acceptable quality, start creating device-specific builds (for example, one build each for SE JP-7, SE JP-8 and Nokia S60v1). Test on at least one device from each group, for example one S60v1 device. If you have the resources, test on all devices.

• Beta tests: informal, product-wide tests conducted by 'friendly' users (see deploy below). (http://www.slideshare.net/UdayaSree/software-testing-life-cycle-presentation)
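The passes above run in sequence, and the build should fail if any of them fails. A minimal sketch of such a driver; the suite commands passed in are placeholders, not the project's real runners:

```shell
#!/bin/sh
# Run each test suite in order, tally results, and return the number of
# failed suites so CI can fail the build on any non-zero count.
run_suites() {
    failed=0
    for suite in "$@"; do
        if sh -c "$suite" >/dev/null 2>&1; then
            echo "PASS: $suite"
        else
            echo "FAIL: $suite"
            failed=$((failed + 1))
        fi
    done
    echo "$failed suite(s) failed"
    return "$failed"
}

# Example wiring (placeholder commands -- substitute the real runners):
# run_suites "make unit-test" "make integration-test" "make system-test"
```

Keeping the unit, integration and system passes behind one driver means the same command works on a developer machine and on the CI server.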


  6. As the software evolves towards function-complete, the developer reviews and refines the automation tour into discernible unit tests with hard-assert properties. This achieves a steady increase in granularity and tighter pass conditions towards a release candidate, as run by the QA resource, who consistently reports issues found as the tests are run.

  7. The team runs regression and functional testing for the new features.

  8. If testing uncovers bugs, post the bug info in Rally/Assembla or any suitable tracker and assign it to the correct developer. Every new bug is in Active status. When the dev team fixes and reviews a bug, they change the status to Resolved, adding the build number and a text reason for the status change. This triggers an email to QA to re-run the test; repeat until QA certifies the fix.
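The Resolved hand-off works only if the status change carries the build number and reason in a consistent shape. A small sketch of that formatting step; the actual tracker call (Rally/Assembla/JIRA) is tool-specific and only indicated in a comment:

```shell
#!/bin/sh
# Format the update a developer attaches when resolving a bug:
# bug ID, build number and a text reason, so QA knows what to re-test.
resolve_comment() {
    bug_id="$1"
    build="$2"
    reason="$3"
    echo "$bug_id -> Resolved (build $build): $reason"
}

# A real integration would push this through the tracker's REST API, e.g.:
# curl -u "$USER:$TOKEN" -X POST "$TRACKER_URL/..." -d "..."
# (endpoint and payload vary -- consult your tracker's API documentation)
```

Generating the text from one function keeps every Resolved notification uniform, which makes the re-test email QA receives easy to scan.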

  9. No one but QA should be allowed to close tickets, so the bug lifecycle is maintained using tools like Rally/QC.

  10. Use the built-in screen capture on iOS and Snagit on Windows 7 to capture screens.

  11. Use Skype, Gmail and daily meetings to communicate within the QA team.

  12. Schedule a demo/show-me-tell-me session at least once a week.

  13. Send an iteration report to stakeholders with bug statistics and a project status summary (LATEST CODE QUALITY REPORT HERE).

  14. Finally, QA ensures that the project has a sound project plan, thoroughly reviewed requirements, well-written design docs and a sound test plan.


DURING BETA DEPLOY

If you are making a high-quality app, beta testers can help a lot by requesting features that you may not have thought of. By implementing some of these features before launch, your app can be of much higher quality.

  1. Use a QA script to:

i. Get Jenkins (the build server) to poll GitHub; if a change is detected in the codebase, a build is initiated.

ii. An .ipa is created from the built app and the provisioning profile is embedded.

iii. A .zip file is created with the .ipa and the provisioning profile for legacy iOS users.

iv. The plist manifest file and the index.html web page are autogenerated.

v. All of these files, plus the .dSYM directory, are then copied across to the specified web server and versioned by the current Git hash.
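Steps ii-v can be driven from one publish script. A sketch under stated assumptions: the app name and web-server root are hypothetical, and the build, signing and copy commands are left as comments because they need a macOS build box with signing assets; the versioned-path logic is the concrete part:

```shell
#!/bin/sh
# Publish a build drop versioned by the current Git hash (step v above).
APP=MyApp                      # hypothetical app name
DEPLOY_ROOT=/var/www/builds    # hypothetical web-server document root

versioned_dir() {
    # Each drop lands in its own directory named after the Git hash
    hash="$1"
    echo "$DEPLOY_ROOT/$APP/$hash"
}

package_and_publish() {
    hash=$(git rev-parse --short HEAD)
    dest=$(versioned_dir "$hash")
    # ii. build the .ipa and embed the provisioning profile, e.g.:
    #   xcrun -sdk iphoneos PackageApplication "build/$APP.app" \
    #       -o "build/$APP.ipa" --embed "profiles/$APP.mobileprovision"
    # iii. zip the .ipa plus profile for legacy users:
    #   zip -j "build/$APP.zip" "build/$APP.ipa" \
    #       "profiles/$APP.mobileprovision"
    # iv. generate manifest.plist and index.html from templates
    # v. copy artifacts and the .dSYM directory to the web server:
    #   scp -r build/ "webhost:$dest/"
    echo "would publish to $dest"
}
```

Versioning by Git hash makes every over-the-air drop traceable back to the exact commit it was built from, which is what lets QA pin a bug report to a build.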

  2. Conduct an over-the-air beta test (UDID) or use a third party such as TestFlight or HockeyApp.

  3. With TestFlight, here's how it works:

i. You register your device by clicking the Login button above (make sure you're on your device!).

ii. Developers receive your device information, and prepare builds for your device.

iii. You get notified by TestFlight when there are new builds available for your device to download.

iv. Users install the build, test the app, and leave great feedback!


  4. Use the marketplace submission checklist to finalise the app.

  5. Clear ALL priority 1 and 2 tickets (showstoppers).

  6. The solution is then functionally tested by QA, usability-tested by the end user, and hammered by the whole team to see if it can be broken. It is then given to the end user, who uses it in parallel with their existing (possibly manual) system for a trial period. All going well, the other users are trained up; if not, iterate until it works.

  7. Submit the app to the Android, Apple or Windows marketplace.
