PostgreSQL Test Plan
The objective of this document is to describe the test plan for Spacewalk during the migration of the database from Oracle to Postgres, as well as the testing of the application against both databases for predefined performance benchmarks and all functionality.
Areas Identified for Testing
- Functional Testing
- Query/Stored Procedure Unit Tests
- System Testing
- Performance Test - queries
- Data Migration - Oracle -> Postgres, Postgres -> Oracle
- API Testing.
- Upgrade Testing
- Scale/Concurrency Testing (eg: large numbers of web requests at once)
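As a sketch of how the scale/concurrency tests might drive many simultaneous requests, the harness below fires a configurable number of concurrent calls and records per-request latencies. The `fake_check_in` function is a placeholder assumption, standing in for a real Spacewalk client check-in or web request.

```python
import concurrent.futures
import time

def run_concurrent(action, n_requests, max_workers=50):
    """Invoke `action` n_requests times concurrently and collect
    per-request latencies in seconds."""
    def timed_call(i):
        start = time.perf_counter()
        action(i)
        return time.perf_counter() - start

    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(timed_call, range(n_requests)))

# Placeholder for a real client check-in; a real test would issue an
# HTTP or XML-RPC call against the Spacewalk server here.
def fake_check_in(system_id):
    time.sleep(0.001)

latencies = run_concurrent(fake_check_in, 200)
```

Replacing `fake_check_in` with a real client call would let the same harness serve both the "large numbers of web requests at once" case and DB concurrency testing.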
Important Milestones during the Testing Process
- Application Knowledge Transfer.
- Environment Identification and Setup
- Access to Scripts and Knowledge Transfer - the scripts referred to here are the available Selenium scripts.
- Identify all the gaps in the existing scripts
- Prioritize the order of the modules.
- Establish a set of benchmarks for performance on Oracle.
- Communication & Reporting plan
- Plan to execute and Test modules - along with sign-offs
- Establish a set of scale benchmarks for performance on Oracle.
- The project wiki will be the primary communication area for tasks, dates, and tracking progress.
- Conference calls twice a week to measure progress.
- Daily coverage during the team overlap hours - the team will be available on the IRC channel as well as other identified modes of communication.
Reporting & Tracking Issues
- Identify the tool to report and Track issues to closure.
- Identify the priority and severity.
- The process for the life cycle of an issue: report - plan - resolve - test - close
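The issue life cycle above can be sketched as a small state machine; the class and transition table here are illustrative assumptions, not part of any chosen tracking tool.

```python
# Transition table for the issue life cycle:
# report -> plan -> resolve -> test -> close
TRANSITIONS = {
    "report": "plan",
    "plan": "resolve",
    "resolve": "test",
    "test": "close",
}

class Issue:
    """Hypothetical issue record advancing through the life cycle."""

    def __init__(self, title):
        self.title = title
        self.state = "report"

    def advance(self):
        if self.state == "close":
            raise ValueError("issue already closed")
        self.state = TRANSITIONS[self.state]
        return self.state

issue = Issue("slow errata search query")
while issue.state != "close":
    issue.advance()
```

Whatever tool is chosen for tracking would need to represent these same states so that reports can show where each issue sits in the cycle.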
- Query/Stored Procedures
User Acceptance Criteria
- Identify and document the user acceptance criteria for the application.
- Includes the sign-off criteria.
Application Knowledge Transfer
- Spend one day teaching the test resources the Spacewalk application functionality.
The initial estimate is one day for a walkthrough; then, as we identify the priority of the modules, we will go deeper into the functionality and evolve the knowledge as we build the test cases and core team members review them.
- Identify Environment Setup:
- Identified as a 4 machine setup.
- Get further details on hardware and software requirements.
- Acquisition of the environment.
- Set up the environment - a two-day task.
- Set up access to the environment for Red Hat.
Identify Modules to Test and Create Functional Use/Test Cases
- Identify all the different modules to Test.
- Create the priority Order
- Create functional Use Cases and build Test Cases around them.
- Who will do this? Ideally the Testing Team will develop these.
- Without a complete understanding of the application, how will EDB do this? This is a way to build knowledge of the application; the quality of the cases will be tracked through constant review by the core team.
- Will these be documented using RH test case templates? We can definitely use the existing test template.
- Are the test cases to be created for ALL functionality, or just those functions not already documented by RH test cases? We will build upon the existing base.
- Review -> update -> review -> sign off cycle for each module.
- The Modules identified to Test are:
- Release Engineering
- E-mail regression
- Quick Search
- Advanced Search
- Auto Errata Updates
- Errata Search
- Configuration Management
- RHN Registration
- SDC Software
- Activation Key
- Reactivation Key
- Multi Org - RHN
- Multi Org - II
- Satellite Sync & Export
- Get access to existing automated test scripts.
- Set up and run the automated test scripts.
- Knowledge Transfer on the existing Test scripts.
- Start building on the existing Test Scripts.
We need to account for manual testing using the client-side parts of the application. For example: have 5000+ systems register or check in and receive some number of updates (RPMs).
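A minimal sketch of generating a synthetic data set for the 5000+ system scenario; every field name and value here is an assumption for illustration, not the actual Spacewalk schema.

```python
import random

def make_system_record(i):
    """One synthetic system profile (field names are assumptions)."""
    return {
        "system_id": 1000000 + i,                      # unique per system
        "hostname": "host-%05d.example.com" % i,       # hypothetical naming scheme
        "release": random.choice(["5Server", "6Server"]),
        "pending_updates": random.randint(0, 40),      # RPMs awaiting delivery
    }

# Generate 5000 synthetic registrations for the check-in scenario.
systems = [make_system_record(i) for i in range(5000)]
```

A data set like this could seed both the registration/check-in runs and the migration regression data set mentioned later.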
- Get a sample data set.
- Identify use cases for performance testing.
- Identify the parameters of performance testing.
- Create a baseline benchmark with the existing application.
- Run the same benchmarks on the migrated environment, then identify and fix any regressions.
- Identify test cases for scale and DB concurrency testing.
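The baseline-then-compare approach above could use a small timing harness like the sketch below; the lambda is a stand-in assumption for an actual query against the Oracle baseline or the migrated Postgres schema.

```python
import statistics
import time

def benchmark(run_query, iterations=100):
    """Time repeated executions of `run_query` and summarize the
    latency distribution in seconds."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        run_query()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return {
        "min": samples[0],
        "median": statistics.median(samples),
        "p95": samples[int(0.95 * (len(samples) - 1))],
        "max": samples[-1],
    }

# Stand-in workload; a real run would execute the identified
# performance-test queries against the database under test.
stats = benchmark(lambda: sum(range(1000)), iterations=50)
```

Recording the same summary for Oracle and Postgres gives directly comparable numbers for the score sheets used at sign-off.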
- Assumption - only migrations from the latest release version will be tested, in both directions: Oracle to Postgres and Postgres to Oracle.
- Create a data set for migration.
- Migrate to Postgres.
- Run regression of the whole application.
- Migrate to Oracle from Postgres.
- Run regression of the whole application.
- Schema upgrade testing.
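One cheap sanity check after each migration direction is comparing per-table row counts between source and target. A sketch, demonstrated with two in-memory SQLite databases standing in for Oracle and Postgres (the table name `rhnserver` is an assumption):

```python
import sqlite3

def compare_row_counts(src_conn, dst_conn, tables):
    """Return the tables whose row counts differ between the source
    and the migrated database. Table names must be trusted (they are
    interpolated into SQL). The SQLite demo uses the connection-level
    execute shortcut; with cx_Oracle/psycopg2 you would go through
    cursors instead."""
    mismatches = {}
    for table in tables:
        src = src_conn.execute("SELECT COUNT(*) FROM %s" % table).fetchone()[0]
        dst = dst_conn.execute("SELECT COUNT(*) FROM %s" % table).fetchone()[0]
        if src != dst:
            mismatches[table] = (src, dst)
    return mismatches

# Demo: the "migrated" database is missing one row.
a = sqlite3.connect(":memory:")
b = sqlite3.connect(":memory:")
for conn in (a, b):
    conn.execute("CREATE TABLE rhnserver (id INTEGER)")
a.executemany("INSERT INTO rhnserver VALUES (?)", [(1,), (2,)])
b.execute("INSERT INTO rhnserver VALUES (1)")
mismatches = compare_row_counts(a, b, ["rhnserver"])
```

Row counts only catch gross losses; the full application regression in the steps above remains the real verification.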
User Acceptance Testing
- Run a complete regression of the application.
- Perform performance testing and generate a score sheet.
- Performance scale/concurrency testing and generate a score sheet.
- Review and get sign-offs. This is a two-phase process:
- Module by module sign-offs
- Entire application sign-off.
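The score sheets mentioned above could be as simple as a CSV of per-module pass rates; the module names and counts below are illustrative assumptions.

```python
import csv
import io

def score_sheet(results):
    """Render module sign-off results as CSV text; `results` maps a
    module name to a (passed, total) pair of test counts."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["module", "passed", "total", "pass_rate"])
    for module, (passed, total) in sorted(results.items()):
        writer.writerow([module, passed, total,
                         "%.0f%%" % (100.0 * passed / total)])
    return buf.getvalue()

# Illustrative numbers only; real counts come from the regression runs.
sheet = score_sheet({"Quick Search": (18, 20), "Errata Search": (20, 20)})
```

One sheet per module supports the module-by-module sign-offs, and a combined sheet supports the entire-application sign-off.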