Appendix B:
MI-CIS-OIT-004
Management Instruction for Agile Independent Verification and Validation
Effective Date: 26 June 2017
Management Instruction: CIS-OIT-004
I. Purpose
This Management Instruction (MI) establishes the U.S. Citizenship and Immigration Services (USCIS) policy for using risk-based Independent Verification and Validation (IV&V) to inform management and oversight decisions, ensuring that Information Technology (IT) programs adhere to USCIS Management Instruction CIS-OIT-003 and its Appendices.
The primary functions of USCIS IV&V are:
• Provide transparency and accountability to the public;
• Provide timely feedback to program executors to continuously improve processes, practices, and outcomes;
• Ensure that projects deliver solutions that meet business objectives, support mission needs, and deliver value;
• Inform management and oversight bodies with an independent assessment of program execution based on data and analysis;
• Ensure compliance with regulatory requirements, USCIS Management Instructions, and Department of Homeland Security (DHS) Management Directives.
To fulfill these functions, USCIS IT programs will use a risk-based IV&V approach to verify and validate the outcomes defined in Management Instruction CIS-OIT-003 and its Appendices. The IV&V process will ensure that the appropriate controls and analyses are applied to each program based on its assessed risk. The approach for each program will be agreed upon through a collaboration of program executors, USCIS management, IV&V teams, and external stakeholders, and will be documented in a new artifact, the Independent Assessment Plan (IAP).
II. Scope
This Management Instruction focuses on the IV&V program at USCIS but applies to all employee and contractor teams involved in the planning, development, and deployment of software and systems throughout USCIS.
Office of Information Technology
III. Authorities
The following laws, regulations, orders, policies, directives, and guidance authorize and govern this Management Instruction:
1. DHS Management Directive (MD) 102-01, "Acquisition Management Directive," and associated Instructions and Guidebooks
2. Section 5202 of the Clinger-Cohen Act of 1996
3. Office of Management and Budget (OMB) Circulars A-130 and A-11
4. 25 Point Implementation Plan to Reform Federal Information Technology Management (U.S. Chief Information Officer, December 9, 2010)
5. Contracting Guidance to Support Modular Development (OMB, June 14, 2012)
6. Memorandum on Agile Development Framework for DHS, by DHS CIO Richard A. Spires, issued June 1, 2012
7. Digital Services Playbook (https://playbook.cio.gov)
IV. Policy, Procedures, and Requirements
Except in cases where a waiver is granted by the Chief Information Officer (CIO), all systems development and maintenance projects at USCIS will require IV&V. Such projects include, for example, custom software development, Commercial Off-The-Shelf (COTS) integration and configuration, business intelligence, and reporting capabilities.
USCIS Office of Information Technology (OIT) management will ensure that appropriate training, coaching, and tools are available to facilitate the success of all projects. Teams are encouraged to work with OIT support groups to implement this Management Instruction in a manner appropriate for their particular context.
A. IV&V Approach
The IV&V approach to each release cycle of each project is based on a holistic assessment of the project, the team’s development history, the release plan, and the deployment plan. This assessment is based on the unique characteristics and measures of each assessed element. Some examples include:
• DHS program level or CIO designation for the project
• Visibility to the public
• Impact on mission critical systems
• Number of internal and external users affected
• Federal Information Processing Standard (FIPS) rating and security or privacy impacts
• Reliance on interfaces to external systems
• Development process
• Outcomes to date (e.g., technical debt, escaped defects, user satisfaction)
At the beginning of each project, the IV&V stakeholders will evaluate the project's risk and create an Independent Assessment Plan (IAP). The IAP will be re-evaluated periodically during the project and revised as needed. The IAP will indicate what level of assessment will be conducted, what resources will be allocated, and what templates will be used for assessment. Appendix C to this document shows an example IAP template (the template may evolve over time). The IAP will serve as a guide for IV&V as the program proceeds.
USCIS has also developed an assessment tool called the Product Quality Assessment (PQA) that will be continuously refined as USCIS OIT gains experience determining the success factors for projects. An example is shown in Appendix B of this document. This instrument will be the default template for IV&V assessments of major programs. The PQA compares the program’s practices and status to the instructions given in Management Instruction CIS-OIT-003 and in its Appendix A. The IV&V process therefore functions as a control to ensure that programs implement the direction specified in Management Instruction CIS-OIT-003 and its Appendices.
There will be a direct correlation between risk and the level of IV&V engagement. High-risk programs (all level one and two programs and certain level three programs) will have embedded IV&V analysts and testers working with the development team, while low-risk programs may only be audited. High-risk programs will also be assessed more frequently than low-risk programs. IV&V will also evaluate and provide feedback on all Systems Engineering Lifecycle (SELC), Acquisition Lifecycle Framework (ALF), and other oversight documents and artifacts.
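The risk-to-engagement correlation described above can be sketched as a simple decision rule. This is an illustrative sketch only; the function name and the "embedded"/"audit" tier labels are assumptions introduced for illustration, not terms defined by this Instruction.

```python
# Illustrative sketch: maps a program's acquisition level to a hypothetical
# IV&V engagement tier, mirroring the policy that level one and two programs
# (and certain level three programs) receive embedded IV&V analysts and
# testers, while low-risk programs may only be audited.

def ivv_engagement(program_level: int, elevated_risk: bool = False) -> str:
    """Return an illustrative IV&V engagement tier for a program."""
    if program_level in (1, 2):
        return "embedded"   # IV&V analysts and testers work with the team
    if program_level == 3 and elevated_risk:
        return "embedded"   # certain level three programs also qualify
    return "audit"          # low-risk programs may only be audited

print(ivv_engagement(1))                       # embedded
print(ivv_engagement(3, elevated_risk=False))  # audit
```

In practice the tier for a given program would be recorded in its IAP rather than derived mechanically; the sketch only captures the stated correlation.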
Key documents, artifacts, and relevant risk assessments will be revisited in each Release Planning Review (RPR) and will be updated as necessary, depending on the level of IV&V engagement, throughout the release cycle.
During project execution, IV&V teams will monitor team progress toward the outcomes set forth in USCIS Management Instruction CIS-OIT-003 and its Appendices. Depending on the needs of the project, IV&V teams may also include other activities such as:
• Sample testing for Section 508 conformance
• Code quality scanning and manual code review
• Unit test review
• Functional testing and test review
• Integration testing
• Performance testing
• End User testing
These activities will inform IV&V teams’ analysis as documented in the PQA. IV&V will execute these activities in a manner that supports agile delivery practices and methods.
B. Team Managed Deployment
To support DevOps and Continuous Delivery best practices, USCIS has developed a methodology called Team Managed Deployments (TMD). To engage in this methodology, a program must be onboarded to TMD by the IV&V stakeholders. In assessing eligibility for TMD, IV&V stakeholders will evaluate a program's readiness relative to CIS-OIT-003 Outcomes #7, #8, #9, and #10. The evaluation and the determination for TMD certification will be conducted as part of the program's next RPR, in accordance with Management Instruction CIS-OIT-003. Example measurements are provided in Appendix A.
C. Value Delivery
A primary function of IV&V at USCIS is to ensure projects deliver solutions that meet business objectives, support mission needs, and deliver value. This work begins in release planning, when the IV&V team works with Product Owners to ensure that teams plan to deliver capabilities that clearly meet mission needs and priorities, as outlined in the Capabilities and Constraints (CAC) document.
At RPR, IV&V ensures alignment with business needs by validating that the appropriate stakeholders (product owners, line of business executives, etc.) are present in person or by delegation and fully approve release plans. During the development cycle, IV&V monitors projects to ensure that business representatives are involved in work planning sessions, and in monitoring and participating in test efforts.
For Level 1 and other designated projects, IV&V also supports Operational Test & Evaluation (OT&E). OT&E, or Operational Testing, takes place on production systems, with production data and real end users. Its purpose is to determine whether the Key Performance Parameters (KPPs) articulated in the Measures of Effectiveness (MOEs) and Measures of Suitability (MOSs) set forth in the Operational Requirements Document (ORD) by the business sponsor at program authorization have been met. Many experts consider this the ultimate test of business value, because these parameters represent what was promised when the program was authorized.
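The KPP verification described above is, at its core, a comparison of measured operational results against thresholds set in the ORD. The sketch below illustrates that comparison; every KPP name, threshold, and measured value here is invented for illustration — actual KPPs come from a program's ORD.

```python
# Hypothetical sketch of verifying KPPs against operational test results.
# All names and values are illustrative assumptions, not real USCIS KPPs.

kpp_thresholds = {                    # thresholds from the ORD (illustrative)
    "case_processing_seconds": 5.0,   # transactions must complete within 5 s
    "availability_percent": 99.0,     # system must be at least 99% available
}

measured = {                          # results from operational testing
    "case_processing_seconds": 4.2,
    "availability_percent": 99.4,
}

def kpp_met(name: str) -> bool:
    """Lower is better for durations; higher is better for availability."""
    if name.endswith("_seconds"):
        return measured[name] <= kpp_thresholds[name]
    return measured[name] >= kpp_thresholds[name]

results = {name: kpp_met(name) for name in kpp_thresholds}
print(results)  # both sample KPPs are met
```

A real OT&E report would, of course, cover many more parameters and include the statistical context behind each measurement; the sketch only shows the pass/fail comparison at the center of the process.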
D. Questions, Comments, and Suggestions
Please address any questions, comments, or suggestions to: USCIS-QA-TEAM@uscis.dhs.gov
VI. Approval
Signed:__________________________ Date:__________________________
Example Metrics
Below are examples of measurements supporting evaluation of objective outcomes. Key measurements must be agreed upon among program management, IV&V, and the USCIS OCIO for regular reporting. The agreement on measurements must be recorded in the POP and revisited at each RPR. This list is neither mandatory nor exhaustive; it serves as an indicator of the types of measurements that should be considered in crafting the POP for a specific program.
MI CIS-OIT-003 Outcome — Examples of IV&V Decision Support Measurements (Trending Preferred)

Outcome #1: Programs and projects frequently deliver valuable product
• Quantitative measurements of program goals (business KPIs)
• Number of deployments
• User satisfaction
• Strategic stakeholder satisfaction
• Usage statistics

Outcome #2: Value is continuously discovered and aligned to mission
• Lead times for new functionality
• Evidence of feedback being incorporated

Outcome #3: Work flows in small batches and is validated
• Work item flow measurements (e.g., cycle time)
• Batch size
• Evidence of test coordination
• Evidence product demonstration feedback is incorporated in requirements

Outcome #4: Quality is built in
• Incident count (escaped defects)
• Code quality measurements
• Test coverage
• Evidence of appropriate tool configuration and use

Outcome #5: The organization continuously learns and improves
• Amount of effort spent on improvement
• Outcomes of retrospective experiments
• Implementation of retrospective action items
• Evidence of appropriate measurement activities
• Evidence of feedback being incorporated
• Evidence of practice transfer across organization

Outcome #6: Teams collaborate across groups and roles to improve flow and remove delays
• Continuous integration availability
• Lean measurements
• "Health check" assessment of team practices

Outcome #7: Security, accessibility, and other compliance constraints are embedded and verifiable
• Cost of compliance issues to customers, users, and agency
• Number of open issues
• Security risk level
• Section 508 risk level
• Privacy risk level
• Performance risk level

Outcome #8: Consistent and repeatable processes are used across build, deploy, and test
• Rate of broken builds, particularly in later stage gates
• Number/percentage of escaped defects

Outcome #9: The entire system is deployable at any time to any environment
• Percentage of deployments needing rollback
• Pipeline and repository unavailability incidents

Outcome #10: The system has high reliability, availability, and serviceability
• Incident count
• Incident aging and inventory
• Escaped defect count
• Uptime
• Production performance
• Mean time to repair
• Mean time between failures
• Production error count
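Two of the Outcome #10 measurements — mean time to repair (MTTR) and mean time between failures (MTBF) — have simple, well-known computations. The sketch below derives both from a small sample incident log; the timestamps are invented sample data, and the record format is an assumption, not a prescribed reporting structure.

```python
# Illustrative sketch: computing MTTR and MTBF from a hypothetical
# incident log of (failure detected, service restored) timestamp pairs.
from datetime import datetime

incidents = [  # sample data only
    (datetime(2017, 6, 1, 9, 0),  datetime(2017, 6, 1, 10, 0)),
    (datetime(2017, 6, 8, 14, 0), datetime(2017, 6, 8, 17, 0)),
    (datetime(2017, 6, 20, 6, 0), datetime(2017, 6, 20, 8, 0)),
]

# Mean time to repair: average outage duration, in hours.
repair_hours = [(end - start).total_seconds() / 3600 for start, end in incidents]
mttr = sum(repair_hours) / len(repair_hours)

# Mean time between failures: average gap between successive failure
# detections, in hours.
gaps = [(incidents[i + 1][0] - incidents[i][0]).total_seconds() / 3600
        for i in range(len(incidents) - 1)]
mtbf = sum(gaps) / len(gaps)

print(f"MTTR: {mttr:.1f} hours")  # 2.0
print(f"MTBF: {mtbf:.1f} hours")  # 226.5
```

For trending (the preference stated in the table header), these values would be recomputed per reporting period so that IV&V can observe whether reliability is improving or degrading over time.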