
Integrated Test Management

Shawn McFadden edited this page Jun 12, 2024 · 1 revision

Master Test Strategy

The MMIS Modernization Project will use industry standards in the development of quality test processes. Module vendors will utilize effective communications and signoffs with IDHW stakeholders and business teams on plans, test cases and results, gate checks, and key milestones to ensure the MMIS Modernization test approach is implemented successfully.

Test Strategy Development

The foundational work of defining and documenting the as-is business processes, developing high-level business requirements (current and future), and producing the requirements documentation that supports the MMIS Modernization project, all of which will be instrumental in the test strategy, began with the following activities:

Requirements (Identification/Review/Document):

  • Lessons learned interview sessions were held with a variety of individuals to obtain information related to past procurement as well as current concerns. Information obtained was analyzed and loaded to DevOps
  • Numerous discovery sessions were held, based on the CMS Medicaid Business processes descriptions, with SMEs from each of the bureaus/areas to identify current As-Is business processes within IDHW. Input from these sessions resulted in the creation of the As-Is business flow diagrams that were reviewed and approved by the various bureaus/areas
  • Bureaus/areas were engaged in working sessions to document business process and system pain points for inclusion as “to-be” requirements in individual MMIS procurement Invitation to Negotiate (ITN) and/or Request for Proposal (RFP). All information obtained was documented within DevOps
  • NASPO requirements for the Claims Management and Provider Management modules were reviewed, extracted, and loaded to DevOps in preparation for requirement analysis and approval for individual ITN or RFP inclusion
  • Review of existing IDHW SLAs as well as other states' MMIS SLAs; information was loaded to DevOps
  • On-site requirement review sessions were held in September 2022 to perform an initial (first-cut) review of requirements drawn from lessons learned, As-Is, To-Be, SLA, NASPO, and other states' requirements that were identified and may be of value to the IDHW Modernization project
  • A master list of requirements was extracted, following onsite review, and distributed to bureaus/areas for formal review and approval

The process identified above provides a starting point for test case development and test case-to-requirements traceability. Using this information, the state can develop a comprehensive testing strategy for the MMIS Modernization project.
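As a hedged illustration of the traceability concept, the following sketch checks which requirements have no linked test case. The requirement IDs, test case structure, and field names are hypothetical; in practice this data would come from work items and links maintained in Azure DevOps.

```python
# Sketch of a requirements-to-test-case traceability check.
# Data shapes and IDs are illustrative only, not the project's
# actual DevOps work item schema.

def find_untraced_requirements(requirements, test_cases):
    """Return requirement IDs with no linked test case."""
    covered = {req_id for tc in test_cases for req_id in tc["linked_requirements"]}
    return sorted(set(requirements) - covered)

requirements = ["REQ-001", "REQ-002", "REQ-003"]
test_cases = [
    {"id": "TC-100", "linked_requirements": ["REQ-001"]},
    {"id": "TC-101", "linked_requirements": ["REQ-001", "REQ-003"]},
]

print(find_untraced_requirements(requirements, test_cases))  # ['REQ-002']
```

A report like this, run per release, gives reviewers a concrete gap list before test execution begins.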

Integrated Test Management Plan Strategy

The MMIS Modernization Project will use industry standards in the development of quality test processes. Module vendors will utilize effective communications and signoffs with IDHW stakeholders and business teams on plans and test results, gate checks, and key milestones to ensure the MMIS Modernization test approach is implemented successfully. Throughout the MMIS Modernization project, tasks and project plans will require review and revision to facilitate activities for MMIS implementation testing.

Test Case Planning and Development

For vendors to fully understand the MMIS Modernization Project, the SI Advisory Vendor and EQC will facilitate testing strategy sessions between module vendors and IDHW stakeholders to review applicable project documentation, including but not limited to business flows, design documents, and requirements. This documentation will be leveraged to develop test scenarios and test cases that represent the full functionality of the MMIS Modernization Project.

High-level use case scenarios will be provided to IDHW teams and partners at an agreed-upon timeframe prior to scheduled meetings for review and approval. Using the approved high-level scenarios, the SI Advisory Vendor will work with all module vendors and partners to map system test and module integration testing test cases. Test cases will have detailed steps with all necessary information including expected results.

The MMIS Modernization Project will utilize Microsoft Azure DevOps as a centralized testing tool for test case development and tracking.

Test Scenarios and Test Cases Preparation

Preparing test scenarios and test cases is an essential part of the planning effort. The SI Advisory Vendor, EQC, Module vendors, and IDHW stakeholders will need to address how they will coordinate the following test scenario/case tasks:

  • Review the high-level requirements and detailed system designs
  • Collaborate to gain understanding of expectations required to develop test scenarios and test cases
  • Develop and review test scenarios based on high-level requirements
  • Collaborate to ensure developed scenarios align with IDHW expectations
  • Gain buy-in and support with the development of test scenarios
  • Work to create test cases based on the approved scenarios
  • Work to ensure steps identified in test cases are accurate to module design and assist in identifying module dependencies for integration case development
  • Review system test and module integration test cases for reusability and to identify gaps for the creation of additional test cases
  • Present scenarios and test cases for review and approval

Traceability Management

Traceability Management ensures that all functional requirements, including integration with other modules, are validated during the testing effort. The MMIS Modernization Project will use Microsoft Azure DevOps to perform requirements traceability during all phases of the project. The RQMP defines the approach to managing requirements throughout the life of the project, including during the various testing phases through module certification. The RQMP is located on the IDHW SharePoint.

Integrated Test Case Management

The MMIS Modernization Project approach and schedule will require that multiple modules be tested at the same time. There will also be functionality and test scenarios that will require integration between multiple modules. As a result, there will be test scenarios developed to verify complete functionality of the application and its modules. The SI Advisory Vendor in collaboration with the EQC will be responsible for the strategic alignment and coordination amongst IDHW stakeholders and module vendors of the following test case management activities:

  • Reviewing and understanding each module vendor's IMS, identifying testing tasks for all module releases and builds
  • Determination of when each module will be available for integration testing
  • A schedule that identifies functionalities being tested between modules per release to assist in test case prioritization
  • Distribution from module vendor test teams of their master schedule outlining the release of system tests and module integration test cases needed for UAT reusability
  • Collaboration with module vendors to identify and modify (where needed) integration test cases that can be reused during UAT
  • Schedule meetings between module test teams to work collaboratively to validate that scenarios and test cases cover specific intersections and actors between modules for UAT
  • Coordination between IDHW managers and leads from each module, as well as the module vendor test leads, is necessary to develop a timeline that accommodates the various design, development, and test case preparation activities involved in the project

Test Environments

During the MMIS Modernization Project, the testing infrastructure must be available in accordance with testing timelines outlined in the project plan. Dedicated testing environments will need to be set up and maintained by the Module Vendors to facilitate all testing cycles. Testing environments will be refreshed, updated, and synchronized according to the rules defined in each Module Vendor's MTP and in accordance with the IMS. Each vendor will be required to document in their security plans their approach to, and controls for, managing Protected Health Information (PHI)/Personally Identifiable Information (PII) in any of their environments. The appropriate use of PHI/PII for testing and the processes for de-identifying it will be determined as the project matures. Table 5 provides a sample of planned testing environments that may be required as part of the MMIS Modernization Project:

Testing Processes

The SI Advisory Vendor and EQC are responsible for monitoring each Vendor's module testing in all test phases. The SI Advisory Vendor, EQC, and module vendors will collaborate to identify all stakeholders needed for testing. The SI Advisory Vendor and EQC will monitor test phase entrance and exit criteria, as well as defects resulting from testing and the time taken to resolve them, to ensure contractual SLAs are met. Monitoring and support test-related activities include:

  • Identification and coordination of stakeholders and others needed to support Vendor module testing
  • Validation/verification of test cases
  • ST and SIT defects monitoring
  • Test case execution and adherence to test phase entrance and exit gate/criteria
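The defect-monitoring task above can be sketched as a simple resolution-time check. The severity levels and SLA targets (in days) below are hypothetical examples for illustration, not the project's actual contractual values, which each module contract would define.

```python
# Sketch of defect resolution-time monitoring against contractual SLAs.
# Severity names and SLA targets are hypothetical placeholders.
from datetime import date

SLA_DAYS = {"critical": 2, "high": 5, "medium": 10}  # assumed example targets

def sla_breaches(defects, today):
    """Return IDs of unresolved defects older than their severity's SLA."""
    breaches = []
    for d in defects:
        age_days = (today - d["opened"]).days
        if d["resolved"] is None and age_days > SLA_DAYS[d["severity"]]:
            breaches.append(d["id"])
    return breaches

defects = [
    {"id": "DEF-1", "severity": "critical", "opened": date(2024, 6, 1), "resolved": None},
    {"id": "DEF-2", "severity": "medium", "opened": date(2024, 6, 5), "resolved": None},
]
print(sla_breaches(defects, date(2024, 6, 10)))  # ['DEF-1']
```

Run against a DevOps defect export, a check like this flags at-risk defects before an SLA breach becomes a contractual issue.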

As the project matures, governance is fully established, and procurements complete, testing processes will be better understood, and this section will be updated to describe the overarching cross-module testing processes.

Vendor Test Expectations

Module vendors are expected to develop and maintain documentation regarding vendor testing process tasks, including test methods, inputs, outputs, schedule, risks, and assumptions. Documents will include a link to a designated IDHW SharePoint folder where test plans will reside. In most cases, individual module test processes reside in the module vendor work plans and test plan deliverables. IDHW expects vendors to have established testing processes that result in documentation acceptable to, and approvable by, IDHW.

Testing Tools and Document Management

This section describes the planned testing and document management tools to be used during MMIS Modernization project testing. As the project matures, governance is fully established, and procurements complete, testing tools and document management will be better understood, and this section will be updated.

Idaho Shared Repository

The MMIS Modernization Project will use Microsoft SharePoint as a shared repository to support the testing effort. SharePoint will house documentation that supports the testing effort, including test plans, schedules, and other related documentation. Work items in IDHW DevOps will securely link to supporting documents (e.g., test results) in Microsoft SharePoint. NOTE: Neither Microsoft Azure DevOps nor the IDHW SharePoint site should contain any PHI or PII; all test data must be de-identified.
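As a minimal sketch of the de-identification requirement, the example below masks direct identifiers in a test record before it could be stored in DevOps or SharePoint. The field names and masking rules are hypothetical; an actual de-identification process would follow the project's PSP and applicable HIPAA guidance.

```python
# Minimal sketch of masking direct identifiers in test data.
# Field names and rules are illustrative assumptions only.
import re

def deidentify(record):
    """Return a copy of the record with direct identifiers masked."""
    masked = dict(record)
    masked["name"] = "TEST MEMBER"
    masked["ssn"] = re.sub(r"\d", "X", record["ssn"])  # e.g. XXX-XX-XXXX
    masked["dob"] = record["dob"][:4] + "-01-01"       # keep year only
    return masked

record = {"name": "Jane Doe", "ssn": "123-45-6789", "dob": "1980-07-15"}
print(deidentify(record))
```

A transformation step of this kind would run before any extract leaves a vendor-controlled environment.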

Planned Test Reports

Informative, accurate test reports are a major factor in the success of test efforts. While each module vendor will have their own existing test reports that will need to meet the approval of IDHW, it will also be required that the module vendors support a cross-module test progress reporting framework. The purpose of this cross-module report will be to provide IDHW with a project-wide view of status of the test efforts. IDHW, the EQC, and the SI will determine the data needed to support this approach. The data and methodology for that support will be determined as modules are procured for the project. Additionally, each vendor is expected to generate test reports from their own IDHW-approved testing frameworks. The specific reports and the data to include in them will be determined with each vendor, in collaboration with IDHW, the EQC, and the SI.
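To illustrate what a cross-module roll-up could look like, the sketch below aggregates per-module pass counts into a project-wide summary. The module names and counts are invented for illustration; real figures would come from each vendor's IDHW-approved reporting framework.

```python
# Sketch of a cross-module test progress roll-up.
# Module names and counts are hypothetical examples.

def progress_report(module_results):
    """Summarize executed/passed counts per module and project-wide."""
    lines, total_run, total_passed = [], 0, 0
    for module, r in sorted(module_results.items()):
        pct = 100 * r["passed"] / r["executed"] if r["executed"] else 0
        lines.append(f"{module}: {r['passed']}/{r['executed']} passed ({pct:.0f}%)")
        total_run += r["executed"]
        total_passed += r["passed"]
    lines.append(f"TOTAL: {total_passed}/{total_run} passed")
    return "\n".join(lines)

results = {
    "Claims": {"executed": 120, "passed": 110},
    "Provider": {"executed": 80, "passed": 76},
}
print(progress_report(results))
```

The same aggregation, fed by each vendor's exports, would give IDHW the project-wide view described above.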

Security Testing

This section describes how the MMIS Modernization Project will conduct security testing to identify threats, risks, or any other vulnerabilities present in each module. This section will detail the baseline expectations for security testing of the software modules being implemented, which may include vulnerability scanning, security scanning, penetration testing, and security auditing.

The MMIS Modernization Project will follow the privacy and security guidelines in the IDHW Privacy and Security Plan (PSP). Accordingly, testing will include security testing on each module to ensure compliance with the IDHW PSP. In accordance with regulatory requirements and established policies, periodic monitoring and security assessments are performed to detect compliance and risk issues. Security assessments also provide assurance that sensitive data and other information have not been compromised, and ensure safeguards are reasonable, appropriate, implemented, effective, and operational as intended. The IDHW PSP describes the minimum types of security testing that will be expected of the software modules implemented as part of the MMIS Modernization.

Performance Testing

The purpose of Performance Testing is to assess whether the system, as built and deployed, maintains adequate throughput, satisfactory response, and timely completion of operations under different conditions of volume and stress over a designated period. Performance testing also determines whether, or at what point, extreme conditions are likely to cause a system to fail. Each module vendor is responsible for conducting Performance Testing as part of their system testing lifecycle.

The SI Advisory Vendor will provide oversight to ensure all module vendors comply with performance testing standards for the project. Module Vendors will be expected to prepare and execute performance testing, at minimum, within the parameters described below:

  • Module Vendors are responsible for performance testing and publishing reports
  • The SI Advisory Vendor is responsible for coordinating a true end-to-end performance test across all modules and external partners to validate performance in a production-like environment; the results will influence the Go-Live decision
  • A dedicated performance test environment is to be established, and performance testing should be an independent test activity conducted after successful module integration testing and concurrent with UAT
  • The performance test environment's code base/release version should match the UAT environment, and periodic code updates are to be applied to keep the performance environment synchronized with the UAT environment
  • The infrastructure/configuration of performance test environments should match a production-like environment
  • Each respective module vendor will be responsible for providing reporting metrics for individual module performance testing
  • Performance testing should meet standard KPIs as defined for the module
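The KPI check in the last bullet can be sketched as a percentile threshold on collected response times. The 95th-percentile target of 2.0 seconds below is a hypothetical KPI for illustration, not a value defined for any MMIS module.

```python
# Sketch of checking response-time samples against a KPI threshold.
# The p95 target is an assumed example value.

def percentile(samples, p):
    """Nearest-rank percentile of a list of samples."""
    ordered = sorted(samples)
    k = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[k]

def meets_kpi(response_times, p95_target=2.0):
    """True if the 95th-percentile response time is within target."""
    return percentile(response_times, 95) <= p95_target

times = [0.4, 0.6, 0.5, 1.1, 0.8, 0.7, 0.9, 1.8, 0.6, 0.5]
print(meets_kpi(times))  # True
```

In practice, each vendor's load-testing tool would emit these samples, and the agreed per-module KPI targets would replace the placeholder.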

UAT Testing

The UAT testing plan will be developed by the EQC in collaboration with IDHW. As UAT testing needs are identified and processes established, specific module vendor responsibilities and activities will be outlined here.

Accessibility Testing

Accessibility Testing is the process to ensure the module solution abides by all Americans with Disabilities Act (ADA) compliance standards. Accessibility Testing is the responsibility of each Module Vendor, with results being validated by the SI Advisory Vendor and EQC. Vendor Modules must conform to accessibility standards established under Federal Civil Rights laws and Section 508 of the Rehabilitation Act. Additionally, vendor training materials must also meet all relevant accessibility standards before they are released.

Certification and KPI Reporting Support

The MMIS Modernization Project testing effort will need to support CMS certification of the IDHW MMIS enterprise and SMC. As part of testing, the project will need to include information on what needs to be captured during the testing and planning phases to support certification so that the State and its vendors can demonstrate achievement of Outcomes-Based Certification. The results will provide an opportunity for the State to determine the best approach for reporting outcomes-based data to CMS during the operational phase of the MMIS.