[NASA Logo]

NASA Procedures and Guidelines

This Document is Obsolete and Is No Longer Used.
Check the NODIS Library to access the current version:
http://nodis3.gsfc.nasa.gov


NPR 7123.1A
Eff. Date: March 26, 2007
Cancellation Date: April 29, 2013

NASA Systems Engineering Processes and Requirements w/Change 1 (11/04/09)



Appendix H. Templates

H-1 Sample SE NPR Implementation Plan Template


 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

SE NPR Implementation Plan

 

<Center Name>

 

 

 

 

 

 

 

 

 

 

 

 

 

Revision: <enter rev number>

<enter date>

 

 

National Aeronautics and Space Administration

 


 

SE NPR Implementation Plan

<Center>

<Date>

 

 

 

 

 

 

Prepared by:

 

 

 

 

 

 

 

Name

Date

 

 

 

 

 

 

 

Approved by:

 

 

 

 

 

 

 

Name

Center EMB Member

Date

 

 

 

 

 

 

 

 

Name

Office of Chief Engineer

Date

 


Change Record

 

 

 

Rev.

Date

Originator

Approvals

Description

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 


Table of Contents

 

 

1.0   Introduction

1.1   Purpose

1.2   Scope

1.3   Background

1.4   Designated Governing Authority

2.0   Reference Documents

3.0   Compliance with SE NPR

3.1   Description of Center Compliance Methodology

3.2   Compliance Matrix

3.3   Plan to Close Gaps

4.0   Other

 

Appendix A Acronyms

Appendix B Glossary


Table of Figures

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Table of Tables

 

 

Table 3-1 SE NPR Compliance Matrix

 


1.0 Introduction

 

1.1   Purpose

 

This document presents the organization's plan to implement the requirements of the Systems Engineering (SE) NPR.

 

1.2   Scope

 

This document covers the plan for demonstrating compliance with the SE NPR requirements.

 

1.3   Background

 

Describe the Center's basic product lines and the scope of application of the NPR requirements.

 

1.4   Designated Governing Authority

 

This section describes the criteria or methodology that the Center will use to determine who the designated governing authority (DGA) will be for various classes or categories of projects performed at the Center. One philosophy might be, for example, that for projects under $10 million the DGA will be at the division level.
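
For illustration only, tiered criteria of the kind described above could be recorded as a simple lookup rule so that projects are categorized consistently. The cost thresholds, organizational levels, and function name in the sketch below are hypothetical placeholders, not requirements of this NPR or of any Center.

    # Hypothetical sketch only: thresholds and DGA levels are placeholders
    # to be replaced by the Center's own criteria.
    def dga_for_project(life_cycle_cost_musd: float) -> str:
        """Return a notional DGA level for a project, by life-cycle cost in $M."""
        if life_cycle_cost_musd < 10:
            return "Division"          # e.g., projects under $10 million
        if life_cycle_cost_musd < 250:
            return "Directorate"       # placeholder mid-range threshold
        return "Center Director"       # largest projects

    print(dga_for_project(8))          # -> "Division"
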

 

2.0 Reference Documents

 

List documents, such as existing Center requirements documents and work instructions, that reflect implementation of the requirements of this NPR.

 

3.0 Compliance with SE NPR

 

3.1   Description of Center Compliance Methodology

 

This section would include general textual descriptions of how the organization will approach compliance with the requirements in the SE NPR.

 

This section would also define the population to which these requirements apply and describe how that population will be trained at the Center.

 

Estimates of the cost to implement these requirements may also be included in this section.

 

3.2   Compliance Matrix

 

Table 3-1 provides the cross-reference of the SE NPR requirements with Center documentation.

 

 

 

Table 3-1 SE NPR Compliance Matrix

 

Req ID

SE NPR Section

Requirement Statement

Center Implementation Intent

Existing Center Document(s)/Section

Compliance

Plan to Close Gap

Full

Partial

None

1

2.1.1.2

The OCE, under the authority of this SE NPR, shall ensure compliance with this SE NPR.

NA

NA

NA

NA

NA

2

2.1.1.3

For programs and projects involving more than one Center, the lead organization shall develop documentation to describe the hierarchy and reconciliation of Center plans implementing this NPR.

 

 

 

 

 

3

2.1.1.4

For systems that contain software, the technical team shall ensure that software developed within NASA or acquired complies with NPD 2820.1, NASA Software Policy, and NPR 7150.2, NASA Software Engineering Requirements.

None

 

 

X

Create a new work instruction

4

2.1.1.5

The OCE shall be the clearinghouse for systems engineering policies to ensure compatibility across NASA.

NA

NA

NA

NA

NA

5

2.1.2.2.a

Center Directors shall perform the following activity or delegate it to the appropriate Center organization: develop the SE NPR Implementation Plan per the template in Appendix H-1 describing how the requirements of this SE NPR will be applied to the programs and projects under their cognizance or authority.

 

X

 

 

See this document.

6

2.1.2.2.b

Center Directors shall perform the following activities or delegate them to the appropriate Center organization: establish policies, procedures, and processes to execute the requirements of this SE NPR.

NPR 7120.3

 

X

 

Update Center PR.

7

2.1.2.2.c

Center Directors shall perform the following activities or delegate them to the appropriate Center organization: assess and take corrective actions to improve the execution of the requirements of this SE NPR.

Center Survey

 

X

 

See Center Survey.

8

2.1.2.2.d

Center Directors shall perform the following activities or delegate them to the appropriate Center organization: perform the SE NPR Center Survey in accordance with Appendix H-2 for the purpose of providing feedback on the SE NPR. The initial Center Survey will be submitted five months from the effective date of this SE NPR. Subsequent updates will be upon the request of the OCE, no earlier than nine months after the initial submission. The Center Survey will use the common survey tool in Appendix H-2 and will be submitted through the Center System Engineering Working Group (SEWG) representative.

Center Survey

X

 

 

See Center Survey

9

2.1.2.2.e

Center Directors shall perform the following activity or delegate it to the appropriate Center organization: select appropriate standards applicable to projects under their control.

 

 

 

 

 

10

2.1.3

Each technical team shall execute the Center processes intended to implement this SE NPR under the oversight of the Center Directors in accordance with the SEMP.

 

 

 

 

 

11

2.2.1.2

The Center Directors shall submit their SE NPR Implementation Plan to the OCE within three months after the effective date of this NPR.

Implementation Plan

X

 

 

See Implementation Plan

12

2.2.1.3

The Center Directors shall develop and document in the SE NPR Implementation Plan how the particular Center will assess compliance to the SE NPR and provide regular updates to the OCE.

Implementation Plan

X

 

 

See Implementation Plan last submitted [date]

13

2.3.1.1

The appropriate DGA shall have responsibility to approve or disapprove any SE NPR requirement that is either tailored or waived.

 

 

 

 

 

14

3.1.3

The assigned technical teams shall define in the project SEMP how the required 17 common technical processes, as implemented by Center documentation, will be applied to the various levels of project WBS model system structure during each applicable life-cycle phase and have their approach approved by the DGA.

 

 

 

 

 

15

3.2.1.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for the definition of stakeholder expectations for the applicable WBS model.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

16

3.2.2.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for definition of the technical requirements from the set of agreed upon stakeholder expectations for the applicable WBS model.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

17

3.2.3.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for logical decomposition of the validated technical requirements of the applicable WBS.

Example: JPR 7120.3, Section xxx

X

 

 

 

18

3.2.4.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for designing product solution definitions within the applicable WBS model that satisfy the derived technical requirements.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

19

3.2.5.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for implementation of a design solution definition by making, buying, or reusing an end product of the applicable WBS model.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

20

3.2.6.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for the integration of lower level products into an end product of the applicable WBS model in accordance with its design solution definition.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

21

3.2.7.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for verification of end products generated by the product implementation process or product integration process against their design solution definitions.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

22

3.2.8.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for validation of end products generated by the product implementation process or product integration process against their stakeholder expectations.

None

 

 

X

Add a new section to JPR 7120.3

23

3.2.9.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for transitioning end products to the next higher level WBS-model customer or user.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

24

3.2.10.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for planning the technical effort.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

25

3.2.11.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for management of requirements defined and baselined during the application of the system design processes.

Example: JPR 7120.3, Section xxx

X

 

 

No action needed

26

3.2.12.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for management of the interfaces defined and generated during the application of the system design processes.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

27

3.2.13.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation, for management of the technical risk identified during the technical effort. (NPR 8000.4, Risk Management Procedural Requirements, is to be used as a source document for defining this process, and NPR 8705.5, Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects, provides one means of identifying and assessing technical risk.)

Example: NPR 8000.4 referenced in JPR 7120.3.

X

 

 

 

28

3.2.14.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for configuration management.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

29

3.2.15.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for management of the technical data generated and used in the technical effort.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

30

3.2.16.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for making assessments of the progress of planned technical effort and progress toward requirements satisfaction.

Example: JPR 7120.3, Section xxx

 

X

 

Update section

31

3.2.17.1

The Center Directors or designees shall establish and maintain a process to include activities, requirements, guidelines, and documentation for making technical decisions.

Example: JPR 7120.3, Section xxx

X

 

 

 

32

4.2.1

The assigned NASA technical team shall prepare a SEMP that covers the periods before contract award, during contract performance, and upon contract completion in accordance with content contained in the annotated outline in Appendix D.

 

 

 

 

 

33

4.2.2

The assigned technical team shall use common technical processes, as implemented by the Center's documentation, to establish the technical inputs to the Request for Proposal (RFP) appropriate for the product to be developed, including product requirements and Statement of Work tasks.

 

 

 

 

 

34

4.2.3

The technical team shall determine the technical work products to be delivered by the offeror or contractor to include a contractor SEMP that specifies the systems engineering approach for requirements development; technical solution definition; design realization; product evaluation; product transition; and technical planning, control, assessment, and decision analysis.

 

 

 

 

 

35

4.2.4

The technical team shall provide to the contracting officer, for inclusion in the RFP, the requirements for technical oversight activities planned in the NASA SEMP. (Care should be taken that no requirements or solicitation information is divulged prior to the release of the solicitation by the cognizant contracting officer.)

 

 

 

 

 

36

4.2.5

The technical team shall participate in the evaluation of offeror proposals following applicable NASA and Center source selection procedures.

 

 

 

 

 

37

4.3.1

The assigned technical team, under the authority of the cognizant contracting officer, shall perform the technical oversight activities established in the NASA SEMP.

 

 

 

 

 

38

4.4.1

The assigned technical team shall participate in scheduled milestone reviews to finalize Government acceptance of the deliverables.

 

 

 

 

 

39

4.4.2

The assigned technical team shall participate in product transition to the customer and/or disposal, as defined in the NASA SEMP.

 

 

 

 

 

40

5.2.1.2

Technical teams shall monitor technical effort through periodic technical reviews.

 

 

 

 

 

41

5.2.1.6

The technical team shall ensure that system aspects represented or implemented in software are included in all technical reviews to demonstrate that project technical goals and progress are being achieved and that all NPR 7150.2 software review requirements are implemented.

 

 

 

 

 

42

5.2.2

The technical team shall develop and document plans for technical reviews for use in the project planning process. The technical review schedule, as documented in the SEMP, will be reflected in the overall project plan described in NPR 7120.5. The results of each technical review will be used to update the technical review plan as part of the SEMP update process. The review plans, data, and results should be maintained and dispositioned as Federal records.

 

 

 

 

 

43

5.3.1.2

The technical team shall address the entrance and success criteria listed in Appendix G for applicability to the respective reviews.

 

 

 

 

 

44

5.3.1.3

The technical team shall execute the required Program/System Requirements Review (P/SRR) and Program Approval Review (PAR) in accordance with the review entry and success criteria defined in tables G-1 and G-2 of Appendix G.

 

 

 

 

 

45

5.3.1.4

The technical team shall execute the required program technical reviews in accordance with the following timeline: P/SRR before KDP 0 and PAR before KDP 1.

 

 

 

 

 

46

5.3.1.5

For human FS&GS projects, the technical team shall execute the following required minimum set of technical reviews in accordance with the review entry and success criteria defined in tables G-3, G-4, G-6, G-7, G-8, and G-10 through G-18 of Appendix G: Mission Concept Review (MCR), System Requirements Review (SRR), System Definition Review (SDR), Preliminary Design Review (PDR), Critical Design Review (CDR), System Integration Review (SIR), Test Readiness Review (TRR), System Acceptance Review (SAR), Operational Readiness Review (ORR), Flight Readiness Review (FRR), Post-Launch Assessment Review (PLAR), Critical Event Readiness Review (CERR), Post-Flight Assessment Review (PFAR), and Decommissioning Review (DR). (For more information on program and project life cycles and management reviews, see the appropriate NPR, e.g., NPR 7120.5.)

 

 

 

 

 

47

5.3.1.6

For robotic FS&GS projects, the technical team shall execute and document the following minimum required technical reviews: the MCR, SRR, Mission Definition Review (MDR), PDR, CDR, SIR, TRR, ORR, FRR, PLAR, CERR, and DR in accordance with the review entry and success criteria given in tables G-3, G-4, G-5, G-7, G-8, G-10, G-11, G-13 through G-16, and G-18 of Appendix G. Robotic projects can combine the SRR and MDR based on size and level of risk. If the two reviews are conducted separately, Table G-4 will be used for the SRR and Table G-5 will be used for the MDR. If the two reviews are combined, the entrance and success criteria for both SRR and MDR will be combined for this single review.

 

 

 

 

 

48

5.3.1.7

The technical team shall also execute a Production Readiness Review (PRR) as an additional technical review for both human and robotic FS&GS projects developing or acquiring multiple or similar systems greater than three (or as determined by the project) in accordance with the review entry and success criteria defined in Table G-9 of Appendix G. Any project producing end products with three or less units will still perform the required CDR. The CDR will include production considerations when a PRR is not performed.

 

 

 

 

 

49

5.3.1.8

The technical team shall execute the required FS&GS project technical reviews in accordance with the following timelines:

a.       MCR prior to KDP A.

b.       Human FS&GS project SRR prior to SDR, and robotic missions SRR and MDR prior to KDP B.

c.       Human FS&GS project SDR prior to KDP B.

d.       PDR prior to KDP C.

e.       CDR prior to starting fabrication of system end products and SIR.

f.         PRR prior to starting fabrication of system end products for projects requiring multiple units.

g.       SIR prior to KDP D.

h.       TRR prior to starting product verification and product validation testing.

i.         Human FS&GS project SAR after completion of KDP D.

j.         ORR after SAR or KDP D and before FRR.

k.       FRR prior to KDP E.

l.         PLAR after system end product launch.

m.     CERR after PLAR and before KDP F.

n.       Human FS&GS project PFAR at end of flight and before KDP F.

o.       DR after KDP F.

 

 

 

 

 

50

5.3.1.9

The assigned technical team shall accomplish the monitoring function for flight-related ATD projects using appropriately defined and conducted periodic technical reviews (PTR) and continuation reviews (CRs). (See Figure 5-3.)

 

 

 

 

 

51

5.3.1.10

The assigned technical team shall accomplish the monitoring function for IPs using PTR and SAR. (See Figure 5-3.)

 

 

 

 

 

52

6.2.1

Working with the program/project manager, the technical team shall determine the appropriate level within the system structure at which SEMPs are developed, taking into account factors such as number and complexity of interfaces, operating environments, and risk factors.

 

 

 

 

 

53

6.2.2

The technical team shall baseline the SEMP per the Center's Implementation Plan, incorporating the content contained in Appendix D, Systems Engineering Management Plan, prior to completion of Phase A in the program life cycle or the equivalent milestone.

 

 

 

 

 

54

6.2.3

The DGA shall review and approve or disapprove the SEMP at each major milestone review or its equivalent.

 

 

 

 

 

55

6.2.4

The assigned technical team shall establish the initial SEMP early in the Formulation phase and update it as necessary to reflect changes in scope or improved technical development.

 

 

 

 

 

56

6.2.5

The technical team shall ensure that any technical and discipline plans describe how the technical activities covered in the plans are consistent with the SEMP and are accomplished as fully integrated parts of the technical effort.

 

 

 

 

 

57

6.2.6

The technical team shall ensure that the project's software development/ management plan describes how the software activities are consistent with the SEMP and are accomplished as fully integrated parts of the technical effort.

 

 

 

 

 

 

 

 

 

3.3   Plan to Close Gaps

 

This section would include textual descriptions of how the gaps noted in the matrix will be closed.

 

 

 

 

 

 

 

4.0 Other

 

 

 

 

 

 



H-2 SE NPR Center Survey


 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

SE NPR Center Survey

 

<Center Name>

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Revision: <enter rev number>

<enter date>

 

National Aeronautics and Space Administration

 


 

SE NPR Center Survey

<Center>

<Date>

 

 

 

 

 

 

Prepared by:

 

 

 

 

 

 

 

Name

Date

 

 

 

 

 

 

 

Approved by:

 

 

 

 

 

 

 

Name

 

Date

 

 

 

 

 

 

 

 

Name

 

Date

 


Change Record

 

 

 

Rev.

Date

Originator

Approvals

Description

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 


Table of Contents

 

1.0   Introduction

1.1   Purpose

1.2   Scope

1.3   Background

2.0   Reference Documents

3.0   Planned Activities

3.1   Description of Center-Equivalent Activities

3.2   Traceability Matrix

3.3   Plan to Close Gaps

4.0   Lessons Learned

5.0   Center Best Practices

6.0   Other

 

Appendix A Acronyms

Appendix B Glossary

 

 

Table of Figures

 

 

 

 

Table of Tables

 

Table 3-1 Process Activity Traceability Matrix

 


1.0 Introduction

 

1.1   Purpose

 

This document presents the organization's survey for implementing the best practice activities as described in Appendix C of the Systems Engineering (SE) NPR.

 

1.2   Scope

 

This document covers the plan and traceability for implementing the best practice activities.

 

1.3   Background

 

Describe the Center's basic product lines and the scope of application of the NPR activities.

 

2.0 Reference Documents

 

List documents such as existing Center requirement documents or work instructions that reflect implementation of the NPR activities.

 

3.0 Planned Activities

 

3.1   Description of Center-Equivalent Activities

 

This section would include general textual descriptions of the activities used to accomplish the processes at the Center.

 

3.2   Traceability Matrix

 

Table 3-1 provides the cross-reference of the expected process activities listed in Appendix C of the SE NPR with equivalent activities in or planned for Center documentation.

 


Table 3-1 Process Activity Traceability Matrix

 

No.

NPR Process

Expected Process Activities

Center Implementation

Center Document(s)/Section/Task

Fully Included

Partially Included

Gap

Plan to Close Gap

1

Stakeholder Expectations Definition Process

a). Establish a list that identifies customers and other stakeholders that have an interest in the system and its products.

Example: JPR 7120.3, Section xxx

 

x

 

Update section

b). Elicit customer and other stakeholder expectations (needs, wants, desires, capabilities, external interfaces, and constraints) from the identified stakeholders.

None

 

 

X

Create a new work instruction

c). Establish operational concepts and support strategies based on stakeholders' expected use of the system products over the system's life.

JPR 7120.3, Section xxx

x

 

 

 

d). Define stakeholder expectations in acceptable statements that are complete sentences and have the following characteristics: (1) individually clear, correct, and feasible to satisfy; not stated as to how it is to be satisfied; implementable; only one interpretation of meaning; one actor-verb-object expectation; and can be validated at the level of the system structure at which it is stated; and (2) in pairs or as a set there is an absence of redundancy, consistency with respect to terms used, are not in conflict with one another, and do not contain stakeholder expectations of questionable utility or which have an unacceptable risk of satisfaction.

JPR 7120.3, Section xxx

 

 

X

None

e). Analyze stakeholder expectation statements to establish a set of measures (measures of effectiveness) by which overall system or product effectiveness will be judged, and customer satisfaction will be determined.

 

 

 

 

 

f). Validate that the resulting set of stakeholder expectation statements are upward and downward traceable to reflect the elicited set of stakeholder expectations and that any anomalies identified are resolved.

 

 

 

 

 

g). Obtain commitments from customer and other stakeholders that the resultant set of stakeholder expectation statements is acceptable.

 

 

 

 

 

h). Baseline the agreed to set of stakeholder expectation statements.

 

 

 

 

 

2

Technical Requirements Definition Process

a). Analyze the scope of the technical problem to be solved to identify and resolve the design boundary that identifies: (1) which system functions are under design control and which are not; (2) expected interaction among system functions (data flows, human responses, and behaviors); (3) external physical and functional interfaces (mechanical, electrical, thermal, data, procedural) with other systems; (4) required capacities of system products; (5) timing of events, states, modes, and functions related to operational scenarios; and (6) emerging or maturing technologies necessary to make requirements.

 

 

 

 

 

b). Define constraints affecting the design of the system or products or how the system or products will be able to be used.

 

 

 

 

 

c). Define functional and behavioral expectations for the system or product in acceptable technical terms for the range of anticipated uses of system products as identified in the concept of operations; this permits separating defined stakeholder expectation functions and behaviors that belong to a lower level in the system structure and allocating them to the appropriate level.

 

 

 

 

 

d). Define the performance requirements associated with each defined functional and behavioral expectation.

 

 

 

 

 

e). Define technical requirements in acceptable "shall" statements that are complete sentences with a single "shall" per numbered statement and have the following characteristics: (1) individually clear, correct, and feasible; not stated as to how it is to be satisfied; implementable; only one interpretation of meaning; one actor-verb-object requirement; and can be validated at the level of the system structure at which it is stated; and (2) in pairs or as a set, there is an absence of redundancy, consistency with terms used, no conflict with one another, and form a set of "design-to" requirements.

 

 

 

 

 

f). Validate that the resulting technical requirement statements: (1) have bidirectional traceability to the baselined stakeholder expectations; (2) were formed using valid assumptions; and (3) are essential to and consistent with designing and realizing the appropriate product solution form that will satisfy the applicable product-line life-cycle phase exit criteria.

 

 

 

 

 

 

 

g). Define MOPs for each identified measure of effectiveness (MOE) that cannot be directly used as a design-to technical requirement.

 

 

 

 

 

h). Define appropriate TPMs by which technical progress will be assessed.

 

 

 

 

 

i). Establish the technical requirements baseline.

 

 

 

 

 

3

Logical Decomposition Process

a). Define one or more logical decomposition models based on the defined technical requirements to gain a more detailed understanding and definition of the design problem to be solved.

 

 

 

 

 

b). Allocate the technical requirements to the logical decomposition models to form a set of derived technical requirement statements that have the following characteristics:

(1) describe functional and performance, service and attribute, time, and data flow requirements, etc., as appropriate for the selected set of logical decomposition models;
(2) individually are complete sentences and are clear, correct, and feasible; not stated as to how to be satisfied; implementable; only have one interpretation of meaning, one actor-verb-object expectation; and can be validated at the level of the system structure at which it is stated;

(3) in pairs or as a set, have an absence of redundancy, are adequately related with respect to terms used, and are not in conflict with one another; and

(4) form a set of detailed "design-to" requirements.

 

 

 

 

 

c). Resolve derived technical requirement conflicts.

 

 

 

 

 

d). Validate that the resulting set of derived technical requirements have: (1) bidirectional traceability with the set of validated technical requirements and (2) assumptions and decision rationales consistent with the source set of technical requirements.

 

 

 

 

 

e). Establish the derived technical requirements baseline.

 

 

 

 

 

4

Design Solution Definition Process

a). Define alternative solutions for the system end product being developed or improved that are consistent with derived technical requirements and nonallocated technical requirements, if any.

 

 

 

 

 

 

 

 

b). Analyze each alternative solution against defined criteria, such as satisfaction of external interface requirements; technology requirements; off-the-shelf availability of products; physical failure modes, effects, and criticality; life-cycle cost and support considerations; capacity to evolve; make vs. buy; standardization of products; integration concerns; and context of use issues of operators considering tasks, location, workplace equipment, and ambient conditions.

 

 

 

 

 

c). Select the best solution alternative based on the analysis results of each alternative solution and technical decision analysis recommendations.

 

 

 

 

 

d). Generate the full design description of the selected alternative solution in a form appropriate to the product-line life-cycle phase, location of the WBS model in the system structure, and phase exit criteria to include: (1) system specification and external interface specifications; (2) end product specifications, configuration description documents, and interface specifications; (3) end product subsystem initial specifications, if subsystems are required; (4) requirements for associated supporting enabling products; (5) end product verification plan; (6) end product validation plan; and (7) applicable logistics and operate-to procedures.

 

 

 

 

 

e). Verify that the design solution definition: (1) is realizable within constraints imposed on the technical effort; (2) has specified requirements that are stated in acceptable statements and have bidirectional traceability with the derived technical requirements, technical requirements, and stakeholder expectations; and (3) has decisions and assumptions made in forming the solution consistent with its set of derived technical requirements, separately allocated technical requirements, and identified system product and service constraints.

 

 

 

 

 

f). Baseline the design solution definition specified requirements including the specifications and configuration descriptions.

 

 

 

 

 

 

 

g). Initiate development or acquisition of the life-cycle supporting enabling products needed, as applicable, for research, development, fabrication, integration, test, deployment, operations, sustainment, and disposal.

 

 

 

 

 

h). Initiate development of the system products of the next lower level WBS model, if any.

 

 

 

 

 

5

Product Implementation Process

a). Prepare to conduct product implementation including: (1) prepare a product implementation strategy and detailed planning and procedures and (2) determine whether the product configuration documentation is adequately complete to conduct the type of product implementation as applicable for the product-line life-cycle phase, location of the product in the system structure, and phase exit criteria.

 

 

 

 

 

b). If the strategy is for buying an existing product, participate in the buy of the product including: (1) review the technical information made available by vendors; (2) assist the preparation of requests for acquiring the product from a vendor; (3) assist the inspection of the delivered product and the accompanying documentation; (4) determine whether the vendor conducted product validation or if it will need to be done by a project technical team; and (5) determine the availability of enabling products to provide test, operations, and maintenance support and disposal services for the product.

 

 

 

 

 

c). If the strategy is to reuse a product that exists in the Government inventory, participate in acquiring the reused product including: (1) review the technical information made available for the specified product to be reused; (2) determine supporting documentation and user manuals availability; (3) determine the availability of enabling products to provide test, operations, and maintenance support and disposal services for the product; (4) assist the requests for acquiring the product from Government sources; and (5) assist the inspection of the delivered product and the accompanying documentation.

 

 

 

 

 

 

 

d). If the strategy is to make the product, (1) evaluate the readiness of the product implementation enabling products to make the product, (2) make the specified product in accordance with the specified requirements, configuration documentation, and applicable standards, and (3) prepare appropriate product support documentation, such as integration constraints and/or special procedures for performing product verification and product validation.

 

 

 

 

 

e). Capture work products and related information generated while performing the product implementation process activities.

 

 

 

 

 

6

Product Integration Process

a). Prepare to conduct product integration to include: (1) preparing a product integration strategy, detailed planning for the integration, and integration sequences and procedures; and (2) determining whether the product configuration documentation is adequately complete to conduct the type of product integration applicable for the product-line life-cycle phase, location of the product in the system structure, and management phase exit criteria.

 

 

 

 

 

b). Obtain lower level products required to assemble and integrate into the desired product.

 

 

 

 

 

c). Confirm that the received products that are to be assembled and integrated have been validated to demonstrate that the individual products satisfy the agreed upon set of stakeholder expectations, including interface requirements.

 

 

 

 

 

d). Prepare the integration environment in which assembly and integration will take place to include evaluating the readiness of the product-integration enabling products and the assigned workforce.

 

 

 

 

 

e). Assemble and integrate the received products into the desired end product in accordance with the specified requirements, configuration documentation, interface requirements, applicable standards, and integration sequencing and procedures.

 

 

 

 

 

 

 

f). Prepare appropriate product support documentation, such as special procedures for performing product verification and product validation.

 

 

 

 

 

g). Capture work products and related information generated while performing the product integration process activities.

 

 

 

 

 

 

7

 

Product Verification Process

          

a). Prepare to conduct product verification to include as applicable to the product-line life-cycle phase and WBS model location in the system structure: (1) reviewing the product verification plan for specific procedures, constraints, conditions under which verification will take place, pre- and post-verification actions, and criteria for determining the success or failure of verification methods and procedures; (2) arranging the needed product-verification enabling products and support resources; (3) obtaining the end product to be verified; (4) obtaining the specification and configuration baseline against which the verification is to be made; and (5) establishing and checking the verification environment to ensure readiness for performing the verification.

 

 

 

 

 

b). Perform the product verification in accordance with the product verification plan and defined procedures to collect data on each specified requirement with specific attention given to MOPs.

 

 

 

 

 

c). Analyze the outcomes of the product verification, including identifying verification anomalies, establishing recommended corrective actions, and establishing conformance to each specified requirement under controlled conditions.

 

 

 

 

 

d). Prepare a product verification report providing the evidence of product conformance with the applicable design solution definition specified requirements baseline to which the product was generated, including bidirectional requirements traceability and actions taken to correct anomalies of verification results.

 

 

 

 

 

e). Capture the work products from the product verification.

 

 

 

 

 

 

8

         Product Validation Process

a). Prepare to conduct product validation to include as applicable to the product-line life-cycle phase and product location in the system structure: (1) reviewing the product validation plan for specific procedures, constraints, conditions under which validation will take place, pre- and post-validation actions, and criteria for determining the success or failure of validation methods and procedures; (2) arranging the needed product-validation enabling products and support resources; (3) obtaining the end product to be validated; (4) obtaining the stakeholder expectations baseline against which the validation is to be made; and (5) establishing and evaluating the validation environment to ensure readiness for performing the validation.

 

 

 

 

 

b). Perform the product validation in accordance with the product validation plan and defined procedures to collect data on performance of the product against stakeholder expectations with specific attention given to MOEs.

 

 

 

 

 

c). Analyze the outcomes of the product validation to include identification of validation anomalies, establishing recommended corrective actions, and establishing conformance to stakeholder expectations under operational conditions (actual, analyzed, or simulated).

 

 

 

 

 

d). Prepare a product validation report providing the evidence of product conformance with the stakeholder expectations baseline, including corrective actions taken to correct anomalies of validation results.

 

 

 

 

 

e). Capture the work products from the product validation.

 

 

 

 

 

 

9

 

Product Transition Process

          

 

a). Prepare to conduct product transition to include: (1) preparing a product implementation strategy to establish the type of product transition to be made (to the next higher level customer for product integration or to an end user); and (2) reviewing related end product stakeholder expectations and design solution definition specified requirements to identify special transition procedures and enabling product needs for the type of product transition, if any, for packaging, storage, handling, shipping/transporting, site preparation, installation, or sustainment.

 

 

 

 

 

b). Evaluate the end product, personnel, and enabling product readiness for product transition including: (1) availability and appropriateness of the documentation that will be packaged and shipped with the end product; (2) adequacy of procedures for conducting product transition; (3) availability and skills of personnel to conduct product transition; and (4) availability of packaging materials/containers, handling equipment, storage facilities, and shipping/transporter services.

 

 

 

 

 

c). Prepare the end product for transition to include the packaging and moving the product to the shipping/transporting location and any intermediate storage.

 

 

 

 

 

d). Transition the end product with required documentation to the customer, based on the type of transition required, e.g., to the next higher level WBS model for product integration or to the end user.

 

 

 

 

 

e). Prepare sites, as required, where the end product will be stored, assembled, integrated, installed, used, or maintained, as appropriate for the life-cycle phase, position of the end product in the system structure, and customer agreement.

 

 

 

 

 

f). Capture work products from product transition process activities.

 

 

 

 

 

10

Technical Planning Process

a). Prepare to conduct technical planning to include: (1) preparing or updating a planning strategy for each of the common technical processes of this SE NPR and (2) determining: (a) deliverable work products from technical efforts, (b) technical reporting requirements, (c) other technical information needs for reviews or satisfying product-line life-cycle management phase entry or exit criteria, (d) product and process measures to be used in measuring technical performance, cost, and schedule progress, (e) key or critical technical events with entry and success criteria, (f) data management approach for data collection and storage and how measurement data will be analyzed, reported, and dispositioned as Federal records, (g) technical risks that need to be addressed in the planning effort, (h) tools and engineering methods to be employed in the technical effort, and (i) approach to acquiring and maintaining the technical expertise needed (training and skills development plan).

 

 

 

 

 

b). Define the technical work to be done to include associated technical, support, and management tasks needed to generate the deliverable products and satisfy entry and success criteria of key technical events and the applicable product-line life-cycle management phase.

 

 

 

 

 

c). Schedule, organize, and determine the cost of the technical effort.

 

 

 

 

 

d). Prepare the SEMP and other technical plans needed to support the technical effort and perform the technical processes.

 

 

 

 

 

e). Obtain stakeholder commitments to the technical plans.

 

 

 

 

 

f). Issue authorized technical work directives to implement the technical work.

 

 

 

 

 

g). Capture work products from technical planning activities.

 

 

 

 

 

 

11

Requirements Management Process

a). Prepare to conduct requirements management to include: (1) preparing or updating a strategy and procedures for: (a) establishing that expectation and requirement statements, singularly and as a whole, are prepared in accordance with established formats and rules; (b) identifying expectations and requirements to be managed, expectation and requirement sources, and allocation and traceability of requirements and linking product expectations and requirements with costs, weight, and power allocations, as applicable; and (c) formal initiation, assessment, review, approval, and disposition of engineering change proposals and changes to expectation and requirements baseline; (2) selecting or updating an appropriate requirements management tool; and (3) training technical team members in the established requirements management procedures and in the use of the selected/updated requirements management tool.

 

 

 

 

 

b). Conduct requirements management to include: (1) capturing, storing, and documenting the expectations and requirements; (2) establishing that expectation and requirement statements are compliant with format and other established rules; (3) confirming each established requirements baseline has been validated; and (4) identifying and analyzing out-of-tolerance system-critical technical parameters and unacceptable validation and verification results and proposing requirement-appropriate changes to correct out-of-tolerance requirements.

 

 

 

 

 

c). Conduct expectation and requirements traceability to include: (1) tracking expectations and requirements between baselines, especially MOEs, MOPs, and TPMs and (2) establishing and maintaining appropriate requirements compliance matrixes that contain the requirements, bidirectional traceability, compliance status, and any actions to complete compliance.

 

 

 

 

 

d). Manage expectation and requirement changes to include: (1) reviewing engineering change proposals (ECPs) to determine any changes to established requirement baselines; (2) implementing formal change procedures for proposed and identified expectation or requirement changes; and (3) disseminating the approved change information.

 

 

 

 

 

e). Capture work products from requirements management process activities to include maintaining and reporting information on the rationale for and disposition and implementation of change actions, current requirement compliance status, and expectation and requirement baselines.

 

 

 

 

 

 

12

Interface Management Process

 

a). Prepare or update interface management procedures for: (1) establishing interface management responsibilities for those interfaces that are part of agreement boundaries; (2) maintaining and controlling identified internal and external physical and functional interfaces; (3) preparing and maintaining appropriate physical and functional interface specifications or interface control documents and drawings to describe and control interfaces external to the system end product; (4) identifying interfaces between system products (including humans) and among configuration management items; (5) establishing and implementing formal change procedures for interface evolution; (6) disseminating the needed interface information for integration into technical effort activities and for external interface control; and (7) training technical teams and other applicable support and management personnel in the established interface management procedures.

 

 

 

 

 

b). Conduct interface management during system design activities for each WBS model in the system structure to include: (1) integrating the interface management activities with requirements management activities; (2) analyzing the concept of operations to identify critical interfaces not included in the stakeholder set of expectations; (3) documenting interfaces both external and internal to each WBS model as the development of the system structure emerges and interfaces are added and existing interfaces are changed; (4) documenting origin, destination, stimulus, and special characteristics of interfaces; (5) maintaining the design solution definition for internal horizontal and vertical interfaces between WBS models in the system structure; (6) maintaining horizontal traceability of interface requirements across interfaces and capturing status in the established requirements compliance matrix; and (7) confirming that each interface control document or drawing that is established has been validated with parties on both sides of the interface.

 

 

 

 

 

c). Conduct interface management during product integration activities to include: (1) reviewing product integration procedures to ensure that interfaces are marked to ensure easy and correct assembly/connection with other products; (2) identifying product integration planning to identify interface discrepancies, if any, and report to the proper technical team or technical manager; (3) confirming that a precheck is completed on all physical interfaces before connecting products; (4) evaluating assembled products for interface compatibility; (5) confirming that product verification and product validation plans/procedures include confirming internal and external interfaces; and (6) preparing an interface evaluation report upon completion of integration, product verification, and product validation.

 

 

 

 

 

d). Conduct interface control to include: (1) managing interface changes within the system structure; (2) identifying and tracking proposed and directed changes to interface specifications and interface control documents and drawings; (3) confirming that the vertical and horizontal interface issues are analyzed and resolved when a change affects products on both sides of the interface; (4) controlling traceability of interface changes including source of the change, processing methods, and approvals; and (5) disseminating the approved interface change information for integration into technical efforts at every level of the project.

 

 

 

 

 

 

 

e). Capture work products from interface management activities.

 

 

 

 

 

 

13

Technical Risk Management Process

a). Prepare a strategy to conduct technical risk management to include: (1) documenting how the project risk management plan will be implemented in the technical effort; (2) planning identification of technical risk sources and categories; (3) identification of potential technical risks; (4) characterizing and prioritizing technical risks; (5) planning informed technical management (mitigation) actions should the risk event occur; (6) tracking technical risk status against established triggers; (7) resolving technical risk by taking planned action if established triggers are tripped; and (8) communicating technical risk status and mitigation actions taken, when appropriate.

 

 

 

 

 

b). Identify technical risks to include: (1) identifying sources of risk issues related to the technical effort; (2) anticipate what could go wrong in each of the source areas to create technical risk issues; (3) analyzing identified technical risks for cause and importance; (4) preparing clear, understandable, and standard form risk statements; and (5) coordinating with relevant stakeholders associated with each identified technical risk.

 

 

 

 

 

c). Conduct technical risk assessment to include: (1) categorize the severity of consequences for each identified technical risk in terms of performance, cost, and schedule impacts to the technical effort and project; (2) analyze the likelihood and uncertainties of events associated with each technical risk and quantify (for example, by probabilities) or qualify (for example, by high, moderate, or low) the probability of occurrence in accordance with project risk management plan rules; and (3) prioritize risks for mitigation.

 

 

 

 

 

d). Prepare for technical risk mitigation to include: (1) selecting risks for mitigation and monitoring; (2) selecting an appropriate risk-handling approach; (3) establishing the risk level or threshold when risk occurrence becomes unacceptable and triggers execution of a risk mitigation action plan; (4) selecting contingency actions and triggers should risk mitigation not work to prevent a problem occurrence; (5) preparing risk mitigation and contingency action plans identifying responsibilities and authorities.

 

 

 

 

 

e). Monitor the status of each technical risk periodically to include: (1) tracking risk status to determine whether conditions or situations have changed so that risk monitoring is no longer needed or new risks have been discovered; (2) comparing risk status and risk thresholds; (3) reporting risk status to decision authorities when a threshold has been triggered and an action plan implemented; (4) preparing technical risk status reports as required by the project risk management plan; (5) communicating risk status during technical reviews in the form specified by the project risk management plan.

 

 

 

 

 

f). Implement technical risk mitigation and contingency action plans when the applicable thresholds have been triggered to include: (1) monitoring the results of the action plan implemented; (2) modifying the action plan as appropriate to the results of the actions; (3) continuing actions until the residual risk and/or consequences impacts are acceptable or become a problem to be solved; (4) communicate to the project when risks are beyond the scope of the technical effort to control, will affect a product higher in the system structure, or represent a significant threat to the technical effort or project success; (5) preparing action plan effectiveness reports as required by the project risk management plan; (6) communicating action plan effectiveness during technical reviews in the form specified by the project risk management plan.

 

 

 

 

 

g). Capture work products from technical risk management activities.

 

 

 

 

 

14

Configuration Management Process

 

 

          

a). Prepare a strategy to conduct configuration management for the system products and designated work products to include: (1) documenting how the project configuration management plan, if any, will be implemented; (2) identifying items to be put under configuration control; (3) identifying schema of identifiers to accurately describe a configuration item and its revisions or versions; (4) controlling changes to configuration items; (5) maintaining and reporting disposition and implementation of change actions to appropriate stakeholders including technical teams within the project; (6) ensuring that products are in compliance with specifications and configuration documentation during reviews and audits; (7) providing the appropriate reference configuration at the start of each product-line life-cycle phase; (8) obtaining appropriate tools for configuration management; and (9) training appropriate technical team members and other technical support and management personnel in the established configuration management strategy and any configuration management procedures and tools.

 

 

 

 

 

b). Identify baselines to be under configuration control to include: (1) listing of the configuration items to control; (2) providing each configuration item with a unique identifier; (3) identifying acceptance requirements for each baseline identified for control; (4) identifying the owner of each configuration item; and (5) establishing a baseline configuration for each configuration item.

 

 

 

 

 

c). Manage configuration change control to include: (1) establishing change criteria, procedures, and responsibilities; (2) receiving, recording, and evaluating change requests; (3) tracking change requests to closure; (4) obtaining appropriate approvals before implementing a change; (5) incorporating approved changes in appropriate configuration items; (6) releasing changed configuration items for use; and (7) monitoring implementation to determine whether changes resulted in unintended effects (e.g., have compromised safety or security of baseline product).

 

 

 

 

 

d). Maintain the status of configuration documentation to include: (1) maintaining configuration item description records and records that verify readiness of configuration items for testing, delivery, or other related technical efforts; (2) maintaining change requests, disposition action taken, and history of change status; (3) maintaining differences between successive baselines; and (4) controlling access to and release of configuration baselines.

 

 

 

 

 

e). Conduct configuration audits to include: (1) auditing baselines under control to confirm that the actual work product configuration matches the documented configuration, the configuration is in conformance with product requirements, and records of all change actions are complete and up to date; (2) identifying risks to the technical effort based on incorrect documentation, implementation, or tracking of changes; (3) assessing the integrity of the baselines; (4) confirming the completeness and correctness of the content of configuration items with applicable requirements; (5) confirming compliance of configuration items with applicable configuration management standards and procedures; and (6) tracking action items to correct anomalies from audit to closure.

f). Capture work products from configuration management activities to include: (1) a list of identified configuration items; (2) description of configuration items placed under control; (3) change requests, disposition of the requests, and rationale for the dispositions; (4) documented changes with reason for changes and change actions; (5) archive of old baselines; and (6) required reports on configuration management outcomes.

15

Technical Data Management Process

a). Prepare a strategy for the conduct of technical data management to include: (1) determining required data content and form and electronic data exchange interfaces in accordance with international standards or agreements; (2) establishing a framework for technical data flow within the project technical processes and to/from contractors; (3) designating technical data management responsibilities and authorities regarding origination, generation, capture, archiving, security, privacy, and disposal of technical data work products; (4) establishing the rights, obligations and commitments regarding the retention of, transmission of, and access to technical data items; (5) establishing relevant data storage, transformation, transmission and presentation standards and conventions to be used; (6) establishing project or program policy and agreements or legislative constraints; (7) describing the methods, tools, and metrics used during the technical effort and for technical data management; and (8) training appropriate technical team members and support and management personnel in the established technical data management strategy and related procedures and tools.

b). Collect and store required technical data to include: (1) identifying existing sources of technical data that are designated as outputs of the common technical processes; (2) collecting and storing technical data in accordance with the technical data management strategy and procedures; (3) recording and distributing lessons learned; (4) performing technical data integrity checks on collected data to confirm compliance with content and format requirements and identifying errors in specifying or recording data; and (5) prioritizing, reviewing, and updating technical data collection and storage procedures.
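
Illustrative sketch only: a minimal Python data integrity check that verifies a stored file against a recorded checksum and a format requirement. The assumption that the format requirement is JSON, and the function name, are illustrative only.

    # Illustrative data-integrity check: checksum plus a minimal format check.
    import hashlib, json

    def integrity_check(path: str, expected_sha256: str) -> bool:
        """Return True if the file matches its recorded checksum and parses as JSON."""
        with open(path, "rb") as f:
            data = f.read()
        if hashlib.sha256(data).hexdigest() != expected_sha256:
            return False          # content corrupted or altered since collection
        try:
            json.loads(data)      # format requirement assumed to be JSON here
        except ValueError:
            return False
        return True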

c). Maintain stored technical data to include: (1) managing the databases to maintain proper quality and integrity of the collected and stored technical data and to confirm that the technical data is secure and is available to those with authority to have access; (2) performing technical data maintenance as required; (3) preventing the stored data from being used or accessed inappropriately; (4) maintaining the stored technical data in a manner that protects it against foreseeable hazards, such as fire, flood, earthquake, and riots; and (5) maintaining periodic backups of each technical database.

d). Provide technical data to authorized parties to include: (1) maintaining an information library or reference index to provide data available and access instructions; (2) receiving and evaluating requests for technical data and delivery instructions; (3) confirming that required and requested technical data is appropriately distributed to satisfy the needs of the requesting party and in accordance with established procedures, directives, and agreements; (4) confirming that electronic access rules are followed before allowing access to the database and before any data is electronically released/transferred to the requester; and (5) providing proof of correctness, reliability, and security of technical data provided to internal and external recipients.
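
Illustrative sketch only: a minimal Python check of electronic access rules before technical data is released to a requester. The data item identifier, user names, and access list are hypothetical.

    # Illustrative access check before releasing technical data (notional rules).
    ACCESS_LIST = {"TDP-THERM-001": {"alice", "bob"}}   # data item -> authorized users

    def release_allowed(item_id: str, requester: str) -> bool:
        """Check electronic access rules before transferring data to the requester."""
        return requester in ACCESS_LIST.get(item_id, set())

    print(release_allowed("TDP-THERM-001", "alice"))  # True
    print(release_allowed("TDP-THERM-001", "carol"))  # False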

16

Technical Assessment Process

a). Prepare a strategy for conducting technical assessments to include: (1) identifying the plans against which progress and achievement of the technical effort are to be assessed; (2) establishing procedures for obtaining cost expenditures against work planned and task completions against schedule; (3) identifying and obtaining technical requirements against which product development progress and achievement will be assessed and establishing the procedures for conducting the assessments; (4) establishing events when TPMs, estimation or measurement techniques, and rules for taking action when out-of-tolerance conditions exist will be assessed; (5) identifying and planning for phase-to-phase technical reviews and WBS model-to-model vertical progress reviews, as well as establishing review entry and success criteria, review board members, and close out procedures; (6) establishing which technical effort work products will undergo peer review, the team members who will perform the peer reviews, and reporting requirements; and (7) training team members, support staff, and managers involved in conducting technical assessment activities.

b). Assess technical work productivity (progress and achievement against plans) to include: (1) identifying, collecting, and analyzing process measures (e.g., earned value measurements for measuring progress against planned cost, schedule, resource use, and technical effort tasks) and identifying and reporting cost-effective changes to correct variances; (2) monitoring stakeholder involvement according to the SEMP; and (3) monitoring technical data management against plans.
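
Illustrative sketch only: the standard earned value relationships referred to in item (1), computed in Python from notional monthly figures.

    # Illustrative earned value calculations (notional monthly figures, $K).
    pv, ev, ac = 1200.0, 1100.0, 1300.0   # planned value, earned value, actual cost

    cost_variance     = ev - ac            # negative -> over cost
    schedule_variance = ev - pv            # negative -> behind schedule
    cpi = ev / ac                          # cost performance index
    spi = ev / pv                          # schedule performance index
    print(f"CV={cost_variance:+.0f}K  SV={schedule_variance:+.0f}K  "
          f"CPI={cpi:.2f}  SPI={spi:.2f}")
    # CV=-200K  SV=-100K  CPI=0.85  SPI=0.92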

c). Assess product quality (progress and achievements against technical requirements) to include: (1) identifying, collecting, and analyzing the degree of technical requirement and TPM satisfaction; (2) assessing the maturity of the WBS-model products and services as applicable to the product-line life-cycle phases; and (3) determining any variances from expected values of product performance and identifying and defining cost-effective changes to correct variances.
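
Illustrative sketch only: a Python check of a technical performance measure against its expected value and allocation. The mass TPM and all values are notional.

    # Illustrative TPM variance check (notional mass TPM, values in kg).
    allocation       = 950.0   # allocated (not-to-exceed) value
    current_estimate = 985.0   # current best estimate at this life-cycle phase
    expected         = 930.0   # expected value per the TPM plan at this point

    variance = current_estimate - expected
    margin   = allocation - current_estimate
    if margin < 0:
        print(f"TPM breached: over allocation by {-margin:.1f} kg")
    elif variance > 0:
        print(f"TPM variance of {variance:.1f} kg; corrective action may be needed")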

d). Conduct technical reviews to include: (1) identifying the type of technical reviews and each review's purpose and objectives (see Chapter 5 for specific technical reviews that apply); (2) determining progress toward satisfying entry criteria; (3) establishing the makeup of the review team; (4) preparing the review presentation materials; and (5) identifying and resolving action items resulting from the review.

e). Capture work products from the conduct of technical assessment activities to include: (1) identifying variances resulting from technical assessments; (2) identifying and reporting changes to correct variances; (3) recording methods used in doing assessment activities; (4) documenting assumptions made in arriving at the process and product measure outcomes; and (5) reporting corrective action recommendations.

17

Decision Analysis Process

a). Establish guidelines to determine which technical issues are subject to a formal analysis/evaluation process to include: (1) when to use a formal decision-making procedure, for example, as a result of an effectiveness assessment, a technical tradeoff, a problem needing to be solved, action needed as a response to risk exceeding the acceptable threshold, verification or validation failure, make-buy choice, evaluating a solution alternative, or resolving a requirements conflict; (2) what needs to be documented; (3) who will be the decision makers and their responsibilities and decision authorities; and (4) how decisions that do not require a formal evaluation procedure will be handled.

b). Define the criteria for evaluating alternative solutions to include: (1) the types of criteria to consider, including technology limitations, environmental impact, safety, risks, total ownership and life-cycle costs, and schedule impact; (2) the acceptable range and scale of each criterion; and (3) the rank of each criterion by its importance.

c). Identify alternative solutions to address decision issues to include alternatives for consideration in addition to those that may be provided with the issue.

d). Select evaluation methods and tools/techniques based on the purpose for analyzing a decision and on the availability of the information used to support the method and/or tool.

e). Evaluate alternative solutions with the established criteria and selected methods to include: (1) evaluation of assumptions related to evaluation criteria and of the evidence that supports the assumptions; and (2) evaluation of whether uncertainty in the values for alternative solutions affects the evaluation.
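
Illustrative sketch only: one common evaluation method, a weighted decision matrix, in Python. The criteria, weights, scores, and alternatives are hypothetical.

    # Illustrative weighted decision matrix (criteria, weights, and scores are notional).
    criteria_weights = {"life-cycle cost": 0.4, "risk": 0.3,
                        "schedule impact": 0.2, "safety": 0.1}

    # Scores on a 1-5 scale (higher is better) for each alternative.
    alternatives = {
        "Alternative A": {"life-cycle cost": 4, "risk": 3, "schedule impact": 5, "safety": 4},
        "Alternative B": {"life-cycle cost": 3, "risk": 5, "schedule impact": 3, "safety": 5},
    }

    totals = {name: sum(criteria_weights[c] * s for c, s in scores.items())
              for name, scores in alternatives.items()}
    best = max(totals, key=totals.get)
    print(totals, "-> recommended:", best)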

f). Select recommended solutions from the alternatives based on the evaluation criteria to include documenting the information that justifies the recommendations and gives the impacts of taking the recommended course of action.

g). Report the analysis/evaluation results/findings with recommendations, impacts, and corrective actions.

h). Capture work products from decision analysis activities to include: (1) decision analysis guidelines generated and strategy and procedures used; (2) analysis/evaluation approach, criteria, and methods and tools used; (3) analysis/evaluation results, assumptions made in arriving at recommendations, uncertainties and sensitivities of the recommended actions or corrective actions; and (4) lessons learned and recommendations for improving future decision analyses.

3.3   Plan to Close Gaps

This section would include textual descriptions of how the gaps noted in the matrix will be closed.

4.0 Lessons Learned

This section would include any lessons learned during the Center survey that were valuable to the Center and that might also be useful to other Centers.

5.0 Center Best Practices

This section would include descriptions of what the Center considers its best practices, including those that might be used to update or improve the processes in the SE NPR.

6.0 Other

Any other information that the Center would like to document or pass on.