Your job is not to re-evaluate the proposals. Your job is to make sure the evaluators did theirs correctly, consistent with the solicitation. This training walks through what that looks like in practice.
You are not the technical expert. The end user, the requiring activity, the engineers on the evaluation team: they are the ones who assess whether a proposal is technically acceptable. Your role is oversight. You review the evaluation to confirm it was conducted fairly and consistently with what the solicitation told offerors they would be evaluated on.
That means when you receive a technical evaluation report, you are checking the evaluators' work, not redoing it. If the solicitation said proposals would be evaluated on three factors, the evaluation should address all three factors. If Section M established an adjectival rating scale, the evaluators should be using that scale and their narratives should support the ratings they assigned.
Think of it this way: the evaluators tell you what they found. You verify that how they found it aligns with the rules the solicitation laid out.
When you review a technical evaluation, you are checking for the following:
Did the evaluators evaluate against the stated criteria? Section M of the solicitation defines the evaluation factors and subfactors. The evaluation should directly address those factors. If Section M lists "Technical Approach," "Staffing Plan," and "Transition Plan" as subfactors, the evaluation should have findings under each of those subfactors for each offeror.
Did they apply the rating methodology correctly? If Section M says "Outstanding" means the proposal significantly exceeds the requirement, and the evaluators rated a proposal "Outstanding" but their narrative only says it meets the requirement, that is a disconnect you need to address.
Are the narratives supported? Ratings without rationale are a protest risk. A narrative that says "The offeror's approach is acceptable" without explaining why is not sufficient. You need enough detail so that someone reading the evaluation later (including a GAO attorney) can understand the basis for the rating.
Did they introduce unstated evaluation criteria? This is one of the most common and most dangerous errors. If the solicitation does not list "years of experience" as an evaluation criterion, the evaluators cannot downgrade a proposal because the proposed staff only has two years of experience. Evaluators must stay within the four corners of the solicitation.
Is the evaluation consistent across offerors? If one offeror gets dinged for not addressing a particular requirement but another offeror who also did not address it does not get the same treatment, that is a problem. The same standards have to apply to everyone.
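The mechanical portion of these checks can be sketched in code. To be clear, this is purely illustrative: there is no real system behind it, the data model and the `review_evaluation` function are hypothetical, and it only catches structural disconnects (missing factors, off-scale ratings, empty narratives). Judgment calls such as unstated criteria or narrative-rating mismatches still require reading the evaluation itself.

```python
# Illustrative sketch of a CO's structural review checklist.
# Factor names mirror Section M of the example solicitation; the data
# model is hypothetical, not any real source selection tool.

STATED_FACTORS = ["Technical Approach", "Staffing Plan", "Transition Plan"]
RATING_SCALE = {"Outstanding", "Acceptable", "Unacceptable"}

def review_evaluation(evaluation):
    """evaluation: dict of factor -> {"rating": str, "narrative": str}.
    Returns a list of issues to send back to the evaluators."""
    issues = []
    # Every stated factor must be evaluated, on the stated scale,
    # with a narrative supporting the rating.
    for factor in STATED_FACTORS:
        finding = evaluation.get(factor)
        if finding is None:
            issues.append(f"{factor}: stated factor not evaluated")
            continue
        if finding["rating"] not in RATING_SCALE:
            issues.append(f"{factor}: rating not on the Section M scale")
        if not finding["narrative"].strip():
            issues.append(f"{factor}: rating assigned without narrative")
    # No factors beyond those stated in Section M.
    for factor in evaluation:
        if factor not in STATED_FACTORS:
            issues.append(f"{factor}: factor not stated in Section M")
    return issues

# A compressed version of the flawed Offeror A evaluation from the
# scenario trips the missing-rating check for the Transition Plan:
flawed = {
    "Technical Approach": {"rating": "Acceptable", "narrative": "Tiered model described."},
    "Staffing Plan": {"rating": "Unacceptable", "narrative": "Insufficient staffing."},
    "Transition Plan": {"rating": "", "narrative": "Plan was included."},
}
print(review_evaluation(flawed))
```

Note what this kind of check cannot see: the flawed evaluation's deeper problems (penalizing the offeror for not proposing ITIL, an unstated criterion) pass every structural test, which is exactly why the CO's substantive review matters.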
You are not substituting your judgment for the evaluators'. If the technical team says a proposed staffing plan is weak because it only has three full-time employees for a requirement that clearly needs more, and their reasoning is sound and tied to the solicitation, your job is not to second-guess whether three people really is too few. Your job is to verify that "staffing adequacy" was a stated criterion and that the evaluators explained their reasoning.
You are also not rubber-stamping. If you receive an evaluation that assigns ratings without meaningful narrative, or one that appears to apply criteria not in the solicitation, it is your responsibility to send it back to the evaluators for correction. You are the last line of defense before the source selection decision is made.
Section L (Instructions to Offerors) tells contractors what to submit and how to organize their proposals. It defines page limits, required content, and submission format. Section L is where you shape what you will receive.
Section M (Evaluation Criteria) tells contractors how their proposals will be evaluated. It defines the factors, subfactors, relative importance, and rating methodology. Section M is the standard the evaluators must follow.
These two sections work together. Section L tells the offeror "submit a staffing plan showing the qualifications of key personnel." Section M tells them "the staffing plan will be evaluated for adequacy of qualifications and alignment with the PWS requirements." The evaluation should then assess each proposal's staffing plan against those specific criteria and no others.
When you review the evaluation, you are essentially holding Section M in one hand and the evaluation report in the other, and confirming they match.
Here are the issues that come up most often when reviewing technical evaluations:
| Problem | What It Looks Like | What to Do |
|---|---|---|
| Unstated criteria | Evaluators penalize an offeror for something not listed in Section M. Example: "Offeror did not propose 24/7 coverage" when the solicitation only required business hours support. | Send it back. The evaluators need to remove the finding or explain how it ties to a stated criterion. |
| Ratings without narrative | "Technical Approach: Acceptable." No explanation of what made it acceptable or what the evaluators considered. | Send it back. Each rating needs a narrative that explains the basis. |
| Narrative contradicts rating | Narrative identifies multiple weaknesses, but the rating is still "Outstanding." | Ask the evaluators to reconcile. Either the weaknesses are not as significant as the narrative suggests, or the rating is too high. |
| Factors missing | Section M lists three subfactors but the evaluation only addresses two of them. | Send it back. Every stated factor and subfactor must be evaluated. |
| Inconsistent treatment | Offeror A gets a weakness for a vague transition plan, but Offeror B's equally vague transition plan receives no mention. | Flag it. The evaluators need to apply the same standard to all offerors. |
| Evaluators re-writing the PWS | The evaluation imposes specific approaches or solutions that the solicitation left open. Example: "Offeror should have proposed ITIL framework" when the PWS did not require ITIL. | Send it back. Evaluators assess what was submitted against stated criteria, not against their personal preferences. |
There is nothing wrong with returning an evaluation to the technical team for revision. In fact, it is expected. Your role as the Source Selection Authority (or advisor to the SSA) requires you to ensure the evaluation record is defensible. If it is not, returning it is the responsible thing to do.
When you send it back, be specific. Do not say "this needs work." Tell the evaluators exactly what the issue is: "The narrative for Offeror B under the Staffing Plan subfactor does not support the Unacceptable rating. Please provide specific findings tied to the Section M criteria." Give them clear direction so they can fix it efficiently.
Keep a record of your review comments and the evaluators' responses. This becomes part of the source selection documentation and demonstrates that the CO exercised proper oversight.
You are the contracting officer on a competitive best-value tradeoff acquisition for IT Help Desk Support Services at Ellsworth AFB. The requirement is for Tier I and Tier II help desk support, 0700-1700 Monday through Friday, supporting approximately 3,000 users across the installation. The contract is firm-fixed-price, with one base year plus four option years.
Below are the relevant Sections L and M from your solicitation. Read them carefully. The evaluation that follows should be judged against these sections and nothing else.
L.1 Technical Volume Instructions. The technical volume shall not exceed 20 pages (single-spaced, 12-point font, 1-inch margins). The technical volume shall address the following:
(a) Technical Approach. Describe the offeror's approach to providing Tier I and Tier II help desk support. Address ticket intake, triage, escalation procedures, and resolution processes. Describe how the offeror will meet the performance requirements in the Performance Work Statement (PWS), including the 15-minute initial response time and 95% first-call resolution rate for Tier I tickets.
(b) Staffing Plan. Identify the number and qualifications of personnel proposed to perform the work. Identify key personnel by name and include resumes as an attachment (resumes do not count against the page limit). Describe how the offeror will ensure adequate staffing during the required hours of performance (0700-1700, M-F).
(c) Transition Plan. Describe the offeror's plan to transition into full performance within 30 days of contract start. Address knowledge transfer, system access, and coordination with the outgoing contractor and government IT staff.
M.1 Evaluation Factors. The Government will evaluate proposals based on the following factors, listed in descending order of importance:
Factor 1: Technical Approach (Most Important)
Factor 2: Staffing Plan
Factor 3: Transition Plan (Least Important)
Factor 4: Price (evaluated separately; see M.3)
M.2 Adjectival Rating Scale. Non-price factors will be evaluated using the following scale:
Outstanding: Proposal significantly exceeds the stated requirements in a way that is beneficial to the Government. Contains one or more strengths and no weaknesses.
Acceptable: Proposal meets the stated requirements. May contain strengths. No weaknesses that would impair contract performance.
Unacceptable: Proposal fails to meet one or more stated requirements. Contains one or more deficiencies that would impair contract performance.
M.3 Price Evaluation. Price will be evaluated for reasonableness and completeness. Price is less important than the non-price factors combined. However, as non-price ratings become more equal, price becomes more important.
M.4 Best Value Determination. The Government intends to award to the offeror whose proposal represents the best value to the Government, considering the non-price factors and price. The Government may award to other than the lowest-priced offeror.
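The M.2 scale above doubles as a consistency test for the "narrative contradicts rating" problem: the documented strengths, weaknesses, and deficiencies constrain which adjectival ratings the record can support. A minimal sketch of that logic, with the simplifying assumption (labeled in the comments) that any documented weakness is treated as potentially impairing:

```python
# Illustrative consistency check derived from the M.2 rating definitions.
# A real review reads the narratives; this only catches count-level
# disconnects (e.g., documented weaknesses under an "Outstanding" rating).

def rating_is_consistent(rating, strengths, weaknesses, deficiencies):
    """Return True if the adjectival rating could be supported by the
    documented findings under the M.2 definitions."""
    if deficiencies > 0:
        # A deficiency forecloses anything above Unacceptable.
        return rating == "Unacceptable"
    if rating == "Outstanding":
        # Requires at least one strength and no weaknesses.
        return strengths >= 1 and weaknesses == 0
    if rating == "Acceptable":
        # Simplification: treats every weakness as potentially impairing;
        # M.2 only bars weaknesses "that would impair contract performance."
        return weaknesses == 0
    if rating == "Unacceptable":
        # No documented deficiency means the record does not support it.
        return False
    return False  # rating is not on the M.2 scale at all
```

Run against the knowledge-check situation where Offeror D's Transition Plan is rated Outstanding despite two documented shortcomings, `rating_is_consistent("Outstanding", 1, 2, 0)` comes back `False`, which is the cue to ask the evaluators to reconcile the narrative and the rating.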
The technical evaluation team submits the following evaluation for Offeror A (Pinnacle IT Solutions). As you read it, look for problems, then compare your observations against the corrected version that follows.
Factor 1: Technical Approach –
"The offeror's technical approach is acceptable. They described a tiered support model and addressed ticket management. However, the offeror did not propose using the ITIL framework, which is the industry standard for IT service management. The offeror also only proposed ServiceNow for ticketing rather than the more robust BMC Remedy platform."
Factor 2: Staffing Plan –
"Unacceptable. The offeror proposes 8 help desk technicians. We believe this is insufficient. Additionally, the proposed help desk manager only has 4 years of experience managing IT support operations, which is not enough for a contract of this size."
Factor 3: Transition Plan – (no rating provided)
"The offeror's transition plan was included in their proposal."
After your feedback, the evaluation team revises their report. Below is the corrected version for the same offeror. Notice how every finding ties back to the solicitation.
Factor 1: Technical Approach –
Strength: The offeror described a clearly defined tiered support model with specific escalation thresholds between Tier I and Tier II. The proposal details an automated ticket routing system that categorizes incoming requests by urgency and routes them to the appropriate tier within 5 minutes of receipt, which supports meeting the 15-minute initial response requirement in PWS paragraph 3.2. The offeror also proposed proactive monitoring of recurring ticket categories with monthly trend reports to the government, which exceeds the reporting requirements in PWS paragraph 5.1.
No weaknesses identified.
Factor 2: Staffing Plan –
The offeror proposes 8 help desk technicians (6 Tier I, 2 Tier II) and 1 help desk manager. The proposed staffing provides coverage during the required performance hours (0700-1700, M-F) as stated in PWS paragraph 2.1. Resumes for key personnel demonstrate relevant help desk and IT support qualifications. The proposed manager holds a CompTIA Project+ certification and has managed IT service desk operations for organizations with 2,000+ users, which is relevant to this requirement supporting approximately 3,000 users.
No weaknesses identified.
Factor 3: Transition Plan –
The offeror proposes a 30-day transition plan organized into three phases: (1) knowledge transfer with the outgoing contractor during days 1-10, (2) parallel operations with shadowing during days 11-20, and (3) independent operations with government oversight during days 21-30. The plan addresses system access coordination with the base communications squadron and includes a transition risk register. The proposed approach meets the 30-day transition requirement in Section L, paragraph L.1(c).
No weaknesses identified.
You are the CO reviewing technical evaluations against the same IT Help Desk solicitation described above. For each situation, decide what action you would take.
The evaluators rated Offeror B's Technical Approach as Unacceptable. The narrative states: "The offeror's approach does not adequately address the requirements." No specific requirements are cited, and no deficiencies are identified.
What do you do?
The evaluators gave Offeror C's Staffing Plan a weakness because the proposed Tier II technicians do not hold CompTIA A+ certifications. The PWS requires "qualified IT support technicians" but does not list specific certifications. Section M evaluates the Staffing Plan for "adequacy of qualifications and alignment with PWS requirements."
What do you do?
Offeror D's Transition Plan is rated Outstanding. The narrative states: "The offeror proposes a 45-day transition, which exceeds the 30-day requirement. However, the plan does not address knowledge transfer with the outgoing contractor, and the plan lacks detail on how system access will be obtained." The evaluators identified two specific shortcomings but still assigned Outstanding.
What do you do?
Offeror E and Offeror F both proposed similar technical approaches for ticket management. Both described a manual triage process rather than automated routing. The evaluators gave Offeror E a weakness under Technical Approach for "lack of automation in ticket triage." Offeror F received no such finding and was rated Acceptable.
What do you do?