AFARS

Change Number: 2024-1231
Effective Date: 12/31/2024

CHAPTER 3 EVALUATION AND DECISION PROCESS

3.1 Evaluation Activities

While the specific evaluation processes and tasks will vary between source selections, the basic objective remains constant – to provide the SSA with the information needed to make an informed and reasoned selection. To this end, the evaluators will identify strengths, weaknesses, deficiencies, risks, and uncertainties applicable to each proposal. The process of identifying these findings is crucial to the competitive range determination, the conduct of meaningful discussions and debriefings, and the tradeoff analysis described in the Source Selection Decision Document (SSDD).

Reminder: The SSEB shall not perform comparative analysis of proposals or make source selection recommendations unless requested by the SSA (Reference DoD Source Selection Procedures 1.4.4.4.3).

While the steps below are identified in a linear manner, parts of the process are iterative and some steps may be accomplished concurrently. Except where noted, these steps apply to the evaluation of both the cost and non-cost factors. The groups responsible for evaluating past performance, other non-cost factors, and cost/price normally perform their evaluations in parallel. The PCO and SSEB Chairperson shall ensure that the evaluation of each proposal is performed in a fair, integrated, and comprehensive manner.

Best Practice: Identify acquisition teams at the requirements development phase and provide comprehensive training on the entire process, from acquisition planning through source selection decision. Provide SSEB training covering the final RFP and SSP approximately one to two weeks prior to receipt of proposals.

Step 1: Conduct SSEB Training Prior to receipt of proposals, each evaluator must become familiar with all pertinent documents (e.g., the RFP and SSP). Source selection evaluation training shall be provided/required for each evaluation and conducted by the PCO; at the PCO’s request and under their supervision, the evaluation training may be conducted by another qualified source selection expert or an agency team. Training shall include an overview of the source selection process and required documents, with a detailed focus on how to properly document the rationale for the assigned rating as well as the assessment of each offeror’s proposal’s strengths, weaknesses, uncertainties, risks, and deficiencies. Designated Legal Counsel is also recommended to assist in the source selection evaluation training, providing content relating to ethics, procurement integrity, the protection of source selection information, and the signing of non-disclosure agreements.

The training will be based on the contents of the DoD Source Selection Procedures and this supplement. Defense Acquisition University (DAU) training may be useful and can be required for SSEB members at the PCO or SSA’s discretion. Ensuring all SSEB members have current training at a standardized level is a priority, and is especially crucial when evaluators have no prior source selection evaluation experience or have varying levels of it, as is frequently the case. Specific organization or requirement information should be included as part of the initial SSEB training.

Step 2: Perform Initial Screening of Proposals Upon receipt of proposals, the PCO or designee shall conduct an initial screening to ensure offerors’ proposals comply with the RFP instructions for submission of all required information, including electronic media, in the quantities and format specified in the RFP. The screening of prime and major subcontractor names to ensure no conflict of interest for the SST is strongly recommended, especially if contract advisors are used as part of the evaluation team. Figure 3-1 is an extract of a sample proposal screening checklist that may be used to accomplish this initial screening and should be tailored to match the specific proposal submission requirements of the RFP.

Figure 3-1: Sample Proposal Screening Checklist (Extract)

Step 3: Sharing of Cost/Price Information The SSEB Chairperson and PCO, in coordination with the SSA, shall determine whether cost information will be provided to the technical evaluators and, if it will be provided, under what conditions, when, and what information shall be provided. The SSEB Chairperson and PCO shall ensure the small business participation evaluation team verifies the total proposed price (not individual cost elements) and any subcontracting information with the Cost/Price team. This will ensure the dollar amounts are consistent with what is being proposed in the small business participation proposal.

Step 4: Conduct Initial Evaluation Evaluators will independently read and evaluate the offeror’s proposal against the criteria identified in the RFP and SSP, document their initial evaluation findings (e.g., strengths, weaknesses, deficiencies, risks, and uncertainties), and draft proposed evaluation notices (ENs) for each finding to be addressed, ensuring resulting narrative is sound and meaningful.
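
For illustration only, the findings for each proposal can be captured in a consistent, structured form so that every rating traces back to an RFP criterion and a supporting narrative. The record layout below is a sketch; the field names are hypothetical and are not prescribed by the DoD Source Selection Procedures or this supplement.

```python
from dataclasses import dataclass

# Hypothetical layout for a single evaluation finding. Field names are
# illustrative only; AFARS and the DoD Source Selection Procedures do not
# prescribe a specific format.
FINDING_TYPES = {
    "strength", "significant strength", "weakness",
    "significant weakness", "deficiency", "risk", "uncertainty",
}

@dataclass
class EvaluationFinding:
    offeror: str         # offeror identifier
    factor: str          # evaluation factor from the RFP/SSP
    subfactor: str       # subfactor, if applicable (empty string if none)
    finding_type: str    # one of FINDING_TYPES
    rfp_reference: str   # RFP section/criterion the finding evaluates against
    narrative: str       # rationale supporting the finding
    proposed_en: str     # draft evaluation notice text, if one is needed

    def __post_init__(self):
        if self.finding_type not in FINDING_TYPES:
            raise ValueError(f"unknown finding type: {self.finding_type}")
```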

Step 5: Identify and Document Areas of the Proposal That May Be Resolvable Through Clarifications or Communications If information is required to enhance the government’s understanding of the proposal, the PCO may request amplifying or other relevant information from the offeror by means of the clarification or communication process (see FAR 15.306). The PCO should engage the legal advisor prior to conducting this process. (See Figure 3-3 for a detailed discussion of the differences between clarifications, communications, and discussions.)

Step 6: Assign Ratings for Non-Cost Evaluation Factors When Using the Tradeoff Process – At this point, the evaluators may or may not individually assign ratings to each evaluation factor or subfactor for which they are responsible. At a minimum, each evaluation team (factor, subfactor) must convene to discuss the offeror’s proposal. The purpose of the discussion is to share their views on the offeror’s strengths, weaknesses, deficiencies, risks, and uncertainties related to their assigned evaluation factor(s) / subfactor(s) and to reach a team consensus on findings and rating as appropriate.

NOTE: Ratings must be supported by evaluation findings and narrative statements.

• Consensus requires a meeting of the minds on the assigned rating and associated deficiencies, strengths, weaknesses, uncertainties, and risks. NOTE: A simple averaging of the individual evaluation results does not constitute consensus.

• In exceptional cases where the evaluators are unable to reach consensus without unreasonably delaying the source selection process, the evaluation report shall include the majority conclusion and the dissenting view(s) in the form of a minority opinion, each with supporting rationale. The report must be briefed to the SSAC (if used) and the SSA.

Step 7: Finalize ENs ENs will include deficiencies, significant weaknesses, and weaknesses (and any uncertainties not resolved through clarifications or communications), as well as ENs for significant strengths and strengths, if dictated by the SSP.

Step 8: Prepare Summary Evaluation Reports for Each Factor Each Factor Chair will prepare a summary report for their respective factor which provides a discussion of their associated findings. These reports will help form the Summary SSEB Evaluation Report and must be prepared at each phase of the process: initial, interim, and final evaluations.

Step 9: Prepare a Summary SSEB Evaluation Report The final step is for the SSEB Chairperson to prepare a summary report for each proposal that includes the evaluated price, the rating for each evaluation factor and subfactor, and a discussion of the associated findings (strengths, weaknesses, deficiencies, risks, and uncertainties). A Summary SSEB Evaluation Report must be prepared at each stage of the process: initial, interim, and final evaluations.

Cost or Price Evaluation

Figure 3-2 below provides a side-by-side comparison of what price analysis, cost analysis, and cost realism analysis should include and when each must be used. For detailed instructions and professional guidance on how to conduct these analyses, refer to FAR 15.4 and the Army Cost and Price Portal on the ODASA(P) PAM Knowledge Management Portal: https://armyeitaas.sharepoint-mil.us/sites/ASA-ALT-PAM-ProcProc/SitePages/CostPrice.aspx.

Figure 3-2: Comparison of Price, Cost, and Cost Realism Analysis

The following are some general evaluation guidelines and recommendations for evaluating cost/price:

• The Independent Government Cost Estimate (IGE) may play a key role in cost/price analysis. It serves as a benchmark for price analysis and in cost realism and may also serve as a benchmark for individual cost elements. The IGE must contain a rationale of how it was developed (e.g., what estimating tools were used and what assumptions were made) in order to properly evaluate cost/price.

• With the approval of the SSEB Chairperson and the PCO, the cost/price evaluators should coordinate with the non-cost factor evaluation team leads, as necessary, to ensure consistency between the proposed costs/prices and other portions of the proposal. This interchange between SSEB factor teams is part of the initial validation exercise and should be continued throughout the evaluation process to ensure that interrelationships are promptly identified and reflected in the evaluation findings. For example, the technical evaluation may reveal areas where each offeror’s approach is inadequate or its resourcing unrealistic, given the proposed approach. The technical evaluators and the cost evaluators should crosswalk technical deficiencies and weaknesses and their impact on cost to ensure an adequate understanding of risks and to ensure proper cost realism adjustments can be made to the proposed costs, if applicable.

• When conducting price analysis, consider not only the total price, including options, but also the prices for the individual Contract Line Items to ensure they are not unbalanced. Unbalanced pricing exists when the price of one or more contract line items is significantly overstated or understated as indicated by the application of cost/price analysis techniques. The PCO, with the concurrence of the SSA and if permitted by the RFP, may reject the offer if they determine that this poses an unacceptable risk to the government. For more information on unbalanced pricing, see FAR 15.404-1(g). (An illustrative sketch of this kind of check follows this list.)

• For fixed-price contracts, the evaluation can be as simple as consideration of adequate price competition/comparison of proposed prices received in response to the solicitation and ensuring prices are fair and reasonable.

• Pricing from proposals with marginal or unacceptable technical ratings should only be included in comparison of proposed prices after determining that the offeror included all necessary requirements in the proposed price (for example, a proposal with a significant weakness or deficiency based on a missing item, process, or labor category in the technical proposal is likely to have omitted the same in the proposed price). If only one proposal is determined to be technically acceptable, adequate price competition should not be used as the sole basis for determining price reasonableness.

• For cost-reimbursement contracts, you must analyze the offerors’ estimated costs for both realism and reasonableness. In a competitive environment, the cost realism analysis enables you to determine each offeror’s probable cost of performance (also illustrated in the sketch following this list). This precludes an award decision based on an overly optimistic cost estimate.
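
For illustration only, the sketch below shows one way the quantitative checks described in the list above might be organized: flagging Contract Line Items whose prices deviate substantially from a benchmark such as the IGE (a possible indicator of unbalanced pricing that warrants further analysis), and summing realism-adjusted cost elements to arrive at a probable cost of performance. The threshold, the data layout, and the figures are assumptions for the example, not prescribed analysis techniques.

```python
# Illustrative sketch only. The 30% threshold, the data layout, and the use of
# the IGE as the sole benchmark are assumptions for this example; they are not
# prescribed by AFARS or the DoD Source Selection Procedures.

def flag_possible_unbalanced_clins(proposed, benchmark, threshold=0.30):
    """Return CLINs whose proposed price deviates from the benchmark (e.g.,
    the IGE) by more than the threshold, as a starting point for further
    cost/price analysis (see FAR 15.404-1(g))."""
    flagged = {}
    for clin, price in proposed.items():
        bench = benchmark.get(clin)
        if bench:
            deviation = (price - bench) / bench
            if abs(deviation) > threshold:
                flagged[clin] = round(deviation, 3)
    return flagged

def probable_cost(proposed_elements, realism_adjustments):
    """Sum proposed cost elements after applying any realism adjustments
    (positive or negative) identified by the cost/price evaluators."""
    return sum(proposed_elements[e] + realism_adjustments.get(e, 0.0)
               for e in proposed_elements)

# Hypothetical figures:
proposed = {"0001": 950_000, "0002": 120_000, "0003": 40_000}
ige      = {"0001": 700_000, "0002": 115_000, "0003": 90_000}
print(flag_possible_unbalanced_clins(proposed, ige))
# {'0001': 0.357, '0003': -0.556}  -> CLINs warranting a closer look

elements    = {"labor": 600_000, "materials": 200_000, "odc": 50_000}
adjustments = {"labor": 75_000}   # e.g., labor hours judged unrealistically low
print(probable_cost(elements, adjustments))   # 925000.0
```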

Technical Evaluation

Either of DoD’s two methodologies for evaluation (Reference DoD Source Selection Procedures 3.1.2.1, Methodology 1 – Separate Technical/Risk Rating Process, or 3.1.2.2, Methodology 2 – Combined Technical/Risk Rating) may be utilized when evaluating proposals. The methodology chosen should appropriately ‘fit’ the individual requirement and procurement action with all factors considered.

Past Performance Evaluation

In past performance evaluations, the offeror’s performance record on similar contract efforts is examined, with the information used to reasonably predict whether the offeror will successfully perform the subject requirement. It is important to understand the difference between an offeror’s experience and its past performance: experience is what (work) the offeror has done; past performance is how well the offeror did it.

FAR Parts 9, 12, 15, 36, and 42 contain regulatory policies related to the evaluation of past performance. FAR Part 36 provides specific procedures, forms, and thresholds for evaluation of Architect & Engineering and construction acquisitions.

The Army provides source selection guidance, resources, and best practices for use by the Army Contracting Enterprise (ACE) on the Procurement.Army.Mil (PAM) platform (see https://armyeitaas.sharepoint-mil.us/sites/ASA-ALT-PAM-ProcProc/SitePages/SourceSelection.aspx).

Recency. (No Supplemental Army Guidance. Reference DoD Source Selection Procedures 3.1.3.1.1.)

Relevance. A helpful tool that may assist in determining/verifying the relevancy of a contract referenced in an offeror’s past performance is to locate and review the contract and requirements in Electronic Document Access (EDA). NOTE: EDA requires user registration within the Wide Area Workflow (WAWF) suite of tools located on the Procurement Integrated Enterprise Environment (PIEE) site https://piee.eb.mil/. To ensure your ability to access contract records, complete this process well in advance of the start of source selection. (Reference DoD Source Selection Procedures 3.1.3.1.2.)

Quality of Products or Services. (No Supplemental Army Guidance. Reference DoD Source Selection Procedures 3.1.3.1.3.)

Sources of Past Performance Information. Where possible, use past performance information available from government-wide and agency-wide databases. Use of such information will help to expedite and streamline the evaluation process.

• If possible, contact two points of contact on each contract effort selected for in-depth review. PCOs, SBPs, CORs, Fee Determining Officials, and program management office representatives are often excellent sources of information.

• If multiple points of contact are providing past performance information on a contract (for example, the PCO, SBP, and PM), arrange for submission of consolidated input from these sources. This may remove the need for the evaluation team to reconcile variances in the past performance information submitted.

• In assessing the feedback, pay particular attention to the source of that feedback and their familiarity with the requirements of the contract being assessed. For example, end users may be unfamiliar with the contract requirements, or certain issues and their resolution arising from contract performance may not be apparent to them.

• The agency has an obligation to consider information that has a bearing on an offeror’s past performance if the SST is aware of (or should have been aware of) the information. For example, an agency may not ignore contract performance by an offeror involving the same agency, the same services, and/or the same PCO, simply because an agency official fails to complete the necessary assessments or documentation. Consult legal counsel on how to address this type of information.

Addressing Adverse Past Performance Information. When adverse past performance is obtained, as appropriate, contact the respective point of contact for that contract to obtain further information about the circumstances surrounding the situation. Additionally, and when practical, contact at least one other individual to get a second perspective on the offeror’s performance on the subject acquisition. Consider the context of the performance problems, any mitigating circumstances, the number and severity of the problems, the demonstrated effectiveness of corrective actions taken, and the overall work record.

If there is past performance information that adversely impacts an offeror’s proposal assessment, provide the offeror an opportunity to address any such information on which it has not had a previous opportunity to comment. This opportunity may occur during clarifications, communications, or discussions, depending upon whether discussions are anticipated.

When addressing adverse past performance information, identify the contract, but do not identify the name of the individual who provided the information. Summarize the problem(s) with sufficient detail to give the offeror a reasonable opportunity to respond.

NOTE: Past performance is considered a responsibility-type determination for purposes of SBA’s Certificate of Competency (COC) program, even if the next acceptable offer is also from a small business (See FAR 19.601). FAR 19.602-1(a) requires agencies to refer a finding of non-responsibility to the SBA if the determination would preclude award. Therefore, if the PCO refuses to consider a small business concern for award after evaluating the concern's past performance as a non-trade-off evaluation factor (e.g., a pass/fail, go/no-go, or acceptable/unacceptable), the matter must be referred to the SBA. Alternatively, when past performance will be an evaluation factor in the trade-off process, SBA referral is not required because the evaluation of past performance is part of a comparative, best value evaluation and not a responsibility determination.

Small Business Evaluation

The Army methodology for rating the small business participation factor is to utilize the DoD Source Selection Procedures rating scheme for Small Business Participation (see DoD Source Selection Procedures 3.1.4.1.2 – Table 6). Relying solely on acceptable/unacceptable or pass/fail rating schemes is the least preferred method of evaluating small business participation in best value tradeoff source selections. Such rating schemes do not allow evaluators to give higher ratings to offerors that significantly exceed the stated small business goals or submit proof of binding agreements with small businesses, and they are therefore discouraged.

Additionally, small business past performance should be considered and is required in some cases (See FAR 15.304(c)(3)(ii)). In looking at small business past performance, the government evaluates how well the offeror has performed in achieving its small business goals. Remember that this should only be evaluated for other than small businesses in assessing their compliance with FAR 52.219-9. A tool regularly used by the government is the electronic Subcontracting Reporting System (eSRS).
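
As a simple illustration of this evaluation (the categories and percentages below are hypothetical; actual figures would come from the offeror’s subcontracting plan goals and its eSRS reports), the evaluator is essentially comparing achieved percentages against the corresponding goals:

```python
# Hypothetical goal and achievement percentages for illustration only.
goals    = {"small business": 0.30, "SDVOSB": 0.05, "WOSB": 0.05, "HUBZone": 0.03}
achieved = {"small business": 0.27, "SDVOSB": 0.07, "WOSB": 0.05, "HUBZone": 0.01}

for category, goal in goals.items():
    actual = achieved.get(category, 0.0)
    status = "met" if actual >= goal else "not met"
    print(f"{category}: goal {goal:.0%}, achieved {actual:.0%} ({status})")
```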

NOTE: DFARS PGI 215.304 provides an example that indicates evaluation of past performance compliance within a separate small business participation factor. This may instead be evaluated under the past performance factor, but not in both factors.

Small business offerors (other than firms utilizing the HUBZone price preference) proposing on unrestricted requirements are not held to the requirements of FAR 52.219-14 Limitations on Subcontracting because the clause is applicable to small business set-aside procurements only. However, small business offerors should meet the small business participation factor goals through any, or a combination of the following: performance as a prime small business, performance as a joint venture, or small business subcontracting.

When subcontractor experience is submitted for consideration as part of the proposal, DoD Source Selection Procedures 3.1.6 requires the offeror to include a commitment, signed by both the offeror and the subcontractor, certifying that, if a contract is awarded as a result of the proposal, the parties commit to joint performance as proposed. If the signed commitment is not fully executed by both parties and provided with the Past Performance Proposal, subcontractor references will not be evaluated or considered.

3.2 Documentation of Initial Evaluation Results

See the Army template source selection documents located in the PAM Template Library (https://spcs3.kc.army.mil/asaalt/procurement/SitePages/NewTemplates.aspx).

Following initial evaluations and all required reviews (see DoD Source Selection Procedures 3.2.1), award will either be made without discussions or with discussions (see DoD Source Selection Procedures 3.2.2 and 3.2.3).

Types of Exchanges

After receipt of proposals, there are three types of exchanges that may occur between the government and offerors: clarifications, communications, and negotiations/discussions. When they occur, their purpose and scope, and whether offerors are allowed to revise their proposals as a result, differ for each type of exchange.

Clarifications may only be used when an award will be made without discussions (see FAR 15.306(a)(1) and DoD Source Selection Procedures 3.3.1).

Communications (see FAR 15.306(b) and DoD Source Selection Procedures 3.5.2) and discussions (see FAR 15.306(d) and DoD Source Selection Procedures 3.5) are used when a competitive range will be established. All SSEB exchanges must be accomplished through the use of evaluation notices (ENs).

Figure 3-3: Comparison of Types of Exchanges (After Receipt of Proposals)

Conducting Exchanges with Offerors

The PCO controls all exchanges with offerors. Before participating in any exchanges, the PCO shall review the ground rules with the team members. Exchanges may be conducted in-person, telephonically, via videoconference, or via written correspondence.

During exchanges with offerors, the government may not:

• Favor one offeror over another;

• Reveal an offeror’s technical solution to another offeror;

• Reveal an offeror’s price to another offeror without that offeror’s permission;

• Knowingly disclose source selection information, or reveal the names of individuals providing past performance information;

• Reveal source selection information in violation of statutory and regulatory requirements.

3.3 Award Without Discussions

Reminder: Discussions should be conducted and are the expected course of action for all acquisitions with an estimated value of $100 million or more unless inappropriate for a particular circumstance. Award without discussions on complex, large procurements is discouraged and seldom in the government’s best interest. (Reference DFARS 215.306 and DoD Source Selection Procedures 3.2.3)

3.4 Competitive Range Decision Document – (No Supplemental Army Guidance)

3.5 Discussion Process

Competitive Range

If the competitive range is further reduced for purposes of efficiency, the basis for this reduction must be adequately documented. Considerations for further restricting competition may include expected dollar value of the award, complexity of the acquisition and solutions proposed, and extent of available resources (see FAR 15.306(c)).

NOTE: Predetermined cut-off ratings (e.g., setting a minimum rating or identifying a predetermined number of offerors to be included in the competitive range) must not be established. The government may not limit a competitive range for the purposes of efficiency on the basis of technical scores alone.

The PCO, with approval of the SSA, should continually reassess the competitive range as discussions and evaluations continue to ensure neither the government nor the offerors waste resources by keeping proposals in the competitive range that are no longer contenders for award (see DoD Source Selection Procedures 3.4 and 3.5.3).

Discussions

The government’s objectives, to include the competitive range decision narrative, shall be fully documented in the prenegotiation objective memorandum (POM) prior to entering into discussions (See FAR 15.406-1 and DFARS PGI 215.406-1).

Meaningful discussions do not include advising individual offerors on how to revise their proposals, nor do they include information on how their proposals compare to other offerors’ proposals.

Additionally, discussions must not be misleading. An agency’s framing of a discussion question may not inadvertently mislead an offeror to respond in a manner that does not address the agency’s concerns, or that misinforms the offeror concerning its proposal weaknesses or deficiencies or the government’s requirements.

3.6 Final Proposal Revisions – (No Supplemental Army Guidance)

3.7 Documentation of Final Evaluation Results

At the request of the SSA, the SSAC and/or SSEB members may also present the evaluation results by means of one or more briefings. Figure 3-4 illustrates a sample proposal evaluation matrix that can be used during the briefing. The documentation should be clear and concise and should cross-reference, rather than repeat, information in existing documents as much as possible (e.g., the SSP, evaluation team reports, etc.). In rare instances, if the SSA identifies concerns with the evaluation findings and/or analysis, the SSA may require the SSEB and/or SSAC to conduct a re-evaluation and/or analysis to address these concerns. The evaluation results shall be clearly documented in the SSEB Report (see DoD Source Selection Procedures Paragraph 1.4.4.4.1.5.1, Paragraph 2.2.6, Paragraph 3.2, Paragraph 3.3.2, Paragraph 3.7, and Paragraph 4.1.9).

Figure 3-4: Sample Proposal Evaluation Matrix.

*There is NO significance implied by use of alphabetic identifiers to differentiate between the example offerors.

3.8 Conduct and Document the Comparative Analysis

When performing the comparative analysis, the SSAC will consider each offeror’s total evaluated price and the discriminators in the non-cost ratings as indicated by the SSEB’s evaluation findings for each offeror. Consider these differences in light of the relative importance (or weight) assigned to each evaluation factor.

3.9 Best-Value Decision – (No Supplemental Army Guidance)

3.11 Debriefings – See Appendix A

3.12 Integrating Proposal into the Contract

When planning the acquisition/source selection, coordinate closely with legal counsel to select the best method to incorporate beneficial aspects, such as the small business participation commitment document or above-threshold performance, into the award document. This is vital when aspects of a proposal are cited or emphasized in the SSDD because they were identified as beneficial to the government, especially when the aspects of the proposal support a price premium paid by the government. The following methods may be considered:

Use of Attachment. Beneficial aspects can be captured in a separate document attached to the PWS/Statement of Work (SOW)/SOO which clearly defines the changes to requirements based on specific beneficial aspects but leaves the original PWS/SOW/SOO untouched.

Section C PWS/SOW/SOO, System Specifications, Section H – Special Contract Requirements, or Other. Above-threshold performance may be captured within the PWS/SOW/SOO, System Specifications, Section H – Special Contract Requirements, or otherwise captured in the contract document, depending upon what is proposed. If using this method, care must be taken not to permanently increase the government’s requirements in future RFPs unless it is an intentional decision on the part of the organization to do so.

Best Practice: Methods other than an addendum to the PWS/SOW/SOO may be preferred due to the possibility of inadvertent inclusion in subsequent contracts (causing requirements creep). The intent is not to increase the government’s minimum requirements, but to hold a particular offeror to its proposal. (The government may later determine that the minimum requirement should include the higher performance and include it at time of re-compete.)

Model Contract Process. The RFP should discuss the model contract process (if used) in Section L (or equivalent) to ensure that offerors know that they will be contractually bound to their proposed above-threshold performance. Include language in the RFP describing how the government will capture the promised above-threshold performance prior to award. Above-threshold performance and significant strengths the government expects to capture in the contract should be addressed with the offerors during the discussions process.

When used, model contracts are typically sent to offerors prior to closing discussions and submission of Final Proposal Revisions (FPRs) to include the above-threshold performance that will be captured upon contract award, thereby ensuring that all parties are aware of what is expected of the prospective awardee. Ensure that each offeror’s proposed above-threshold performance is carefully and correctly incorporated into each model contract and the final narrative is consistent with the letter to the offeror requesting the FPR.

Incorporation of Portions of Offeror’s Technical Proposal by Reference. The RFP should advise offerors that any part of their proposal can be incorporated by reference. Only incorporate those portions of an offeror’s technical proposal that provide benefit to the government.

Awarding the Contract(s) and Posting to SAM.gov

After the SSA has signed the source selection decision document, the PCO will execute and distribute the contract award(s) and post to SAM.gov in accordance with FAR 5.303, DFARS 205.303, and AFARS 5105.303 Announcement of contract awards. Congressional notification may be required IAW FAR 5.303 and AFARS 5105.303. For Section 8(a) Set-Asides, the SBA shall be notified IAW FAR 19.804. For Small Business Programs, the apparent unsuccessful offerors shall be provided the pre-award notice required by FAR 15.503.

Notification to Unsuccessful Offerors

The PCO must notify unsuccessful offerors in writing, within the timeframe identified in Figure 3-5 below, after contract award or whenever their proposals are eliminated from the competition. This chart provides a side-by-side comparison of the differences between preaward and postaward notices. The type of information that must be included in the notice will depend upon whether it is sent before or after contract award.

Figure 3-5: Comparison of Preaward and Postaward Notices