Effective Date: June 23, 2021
Expiration Date: June 23, 2026
a. It is NASA policy that evaluation standards adhere to a common set of requirements, use Federal statistical methods, protect data privacy, and are consistent, accessible, and clear to external stakeholders.
b. To adhere to these evaluation standards, NASA will:
(1) Conduct evaluations according to the standards identified in Office of Management and Budget (OMB) Memorandum M-20-12, Phase 4 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Program Evaluation Standards and Practices (OMB M-20-12), integrated with statistical policy and best practices from the Interagency Council on Statistical Policy (ICSP), OMB Statistical Standards and Programs, and Confidential Information Protection and Statistical Efficiency Act (CIPSEA) standards.
(2) Abide by the standards in OMB M-20-12 of rigor; relevance and utility; objectivity and independence; ethics; and transparency. See Attachment C, Evaluation Standards, for definitions of these policy terms and their application at NASA.
a. This NPD is applicable to evaluations conducted at NASA and overseen by the Evaluation Officer.
b. This NPD applies to NASA's efforts to progress its vision of discovering and expanding knowledge for the benefit of humanity by using rigorous methods to study and evaluate the research, development, and modeling that supports the Agency's missions, programs, and strategic goals. NASA's Evaluation Policy is designed to ensure the Agency adheres to industry best practices and has the ability to assess the effectiveness and efficiency of its various initiatives. See Attachment C for statistical and evaluation standards.
c. This NPD applies to NASA Headquarters and NASA Centers, including Component Facilities and Technical and Service Support Centers. It also applies to the Jet Propulsion Laboratory, a NASA Federally Funded Research and Development Center, and to other contractors or parties to agreements only to the extent specified or referenced in the applicable contracts, grants, or agreements.
d. In this directive, all mandatory actions (i.e., requirements) are denoted by statements containing the term "shall"; the terms "may" or "can" denote discretionary privilege or permission; "should" denotes a good practice that is recommended but not required; "will" denotes an expected outcome; and "are/is" denotes descriptive material.
e. In this directive, all document citations are assumed to be the latest version, unless otherwise noted.
a. Federal Evidence-Building Activities, 5 U.S.C. ch. 3, subch. II.
b. Paperwork Reduction Act of 1995, 44 U.S.C. ch. 35.
a. Office of Management and Budget (OMB) Memorandum M-19-23, Phase 1 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Learning Agendas, Personnel, and Planning Guidance (OMB M-19-23) (07/10/2019).
b. OMB Memorandum M-20-12, Phase 4 Implementation of the Foundations for Evidence-Based Policymaking Act of 2018: Program Evaluation Standards and Practices (OMB M-20-12) (03/10/2020).
c. OMB Implementation Guidance for Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA) (10/2016).
a. The Associate Administrator shall designate senior employees of NASA as the Evaluation Officer, the Statistical Official, and the Chief Data Officer, per OMB M-19-23.
b. The Office of the Chief Financial Officer, Strategic Investments Division Director, will be designated the Evaluation Officer, unless otherwise directed by the Associate Administrator. The Evaluation Officer, as designee of the Associate Administrator, shall:
(1) Oversee NASA's Learning Agenda and supporting evaluation activities.
(2) Review NASA's evaluation guidance for adherence to OMB M-19-23 and OMB M-20-12 evaluation standards.
(3) Collaborate in shaping and contributing to other evidence-building functions within NASA and promote an effective evaluation capacity across the Agency.
(4) Collaborate with the Statistical Official and the Chief Data Officer to support effective data management; foster an organizational culture of evaluation values; uphold applicable protocols for sensitive information; establish training requirements; oversee the terms and conditions for evaluation agreements; and coordinate among Agency officials to ensure evaluations comply with all aspects of data management.
c. The Statistical Official should collaborate with the Evaluation Officer to ensure that this policy is in alignment with applicable data management laws and policies and establish frameworks and mechanisms that ensure the creation, collection, use, processing, storage, maintenance, dissemination, disclosure, and disposition of data for evaluation is permissible, responsible, and appropriate.
d. The Chief Data Officer should collaborate with the Evaluation Officer to ensure that this policy is in alignment with applicable data management laws and policies and establish frameworks and mechanisms that ensure the creation, collection, use, processing, storage, maintenance, dissemination, disclosure, and disposition of data for evaluation is permissible, responsible, and appropriate.
e. Evaluators should follow the evaluation standards in this policy, as directed in OMB M-19-23 and OMB M-20-12.
Attachment A: Definitions
Chief Data Officer: Officer with authority and responsibility for data governance and life-cycle data management.
Evaluation: An assessment using systematic data collection and analysis of one or more programs, policies, or organizations, intended to assess their effectiveness and efficiency (OMB M-20-12).
Evaluation Officer: Officer with authority and responsibility for providing leadership over agencies' evaluation and Learning Agenda activities.
Evaluator: Federal staff and associated partners who are trained, through advanced education and evaluation experience (e.g., quantitative, qualitative, and/or mixed-method evaluation specializations), to properly plan, implement, manage, and/or oversee evaluation activities and evaluations.
Statistical Official: Official with authority and responsibility for advising on statistical policy, techniques, and procedures.
Attachment B: References
B.1 Overview of Federal Evidence-Building Efforts by the Office of Management and Budget (OMB): https://obamawhitehouse.archives.gov/sites/default/files/omb/mgmt-gpra/overview_of_federal_evidence_building_efforts.pdf.
B.2 An Evaluation Roadmap For A More Effective Government by the American Evaluation Association (AEA): https://www.eval.org/evaluationroadmap.
B.3 Statistical Policy and Best Practices issued by the Interagency Council on Statistical Policy (ICSP): https://nces.ed.gov/FCSM/policies.asp.
B.4 Principles of Modernizing Production of Federal Statistics issued by the Interagency Council on Statistical Policy (ICSP): https://nces.ed.gov/FCSM/pdf/Principles.pdf.
Attachment C: Evaluation Standards
C.1 Rigor in NASA's evaluation and statistical practices requires ensuring that inferences about cause and effect are well founded (internal validity). It also requires clarity about the populations, settings, or circumstances to which results can be generalized (external validity); as well as the use of measures that accurately capture the intended information (measurement reliability and validity). NASA's evaluations will be conducted using the principles of rigor in both qualitative and quantitative evaluations.
C.2 NASA maintains an evaluation workforce with training and experience appropriate for planning and overseeing a rigorous evaluation portfolio, recruiting staff with appropriate qualifications. NASA also provides professional development opportunities so that staff can keep their evaluation and methodological skills current.
C.3 Relevance and utility require that evaluation priorities take into account legislative requirements and the interests and needs of leadership, missions, and programs; international partners; and space industry partners. Evaluations should be designed to address NASA's diverse missions, partnerships, and stakeholders to produce the most relevant and useful analysis, and should encourage diversity among those conducting the evaluations.
C.4 Independence and objectivity are core principles of evaluation and statistical analysis. Agency and program leadership, program staff, stakeholders, and others should participate in setting evaluation priorities, identifying evaluation questions, and assessing the implications of findings. However, it is important to insulate evaluation functions from undue influence and from both the appearance and the reality of bias. To promote objectivity, NASA protects independence in the design, conduct, and analysis of evaluations. After technical peer review, NASA has authority to approve, release, and disseminate evaluation reports.
C.5 NASA's evaluations will be conducted in an ethical manner and safeguard the dignity, rights, safety, and privacy of participants. Evaluations will comply with both the spirit and the letter of relevant requirements such as regulations governing research involving human subjects. Evaluations will be equitable, fair, and just, and take into account cultural and contextual factors that could influence the findings or their use.
C.6 Whenever possible and appropriate, NASA will make information about evaluations and findings from evaluations broadly available and accessible in the spirit of transparency. This information may include identifying the evaluator, releasing study plans, and describing the evaluation methods. As a non-statistical Agency, NASA will follow the appropriate CIPSEA guidelines for acquiring and handling data protected under CIPSEA.
This document does not bind the public, except as authorized by law or as incorporated into a contract. This document is uncontrolled when printed. Check the NASA Online Directives Information System (NODIS) Library to verify that this is the correct version before use: https://nodis3.gsfc.nasa.gov.