Customer Education

Evaluation Factors for Customers

How to turn "best value" into specific questions, useful evidence, and fair ways to compare contractors before award.

Evaluation factors are future questions. Before award, the Government is trying to predict which contractor can actually meet the need. The customer helps by explaining what the team needs to see, read, test, or hear before that prediction feels reasonable.

This lesson keeps it simple. The goal is to help customers contribute well to evaluation without becoming junior contracting officers. The competency is the same whether the buy is a small commercial purchase or a complex services package: ask for the evidence that helps the Government choose well.

01 Evaluation factors are future questions

The customer usually starts with a noun: staffing plan, technical approach, schedule, resumes, past performance, transition plan, price. Those nouns aren't bad, but they aren't enough.

The useful question is:

What do we need to know or see from this contractor to determine whether what they are offering will meet our need?

That question turns a generic factor into an evaluable factor. "Staffing plan" becomes "can they recruit, retain, and replace qualified people for high-turnover positions without interrupting service?" "Schedule" becomes "is their proposed timeline realistic for site access, onboarding, material lead times, and Government review?"

No secret questions

If the Government wants to use something to decide who wins, industry needs to know that before proposals are due. Evaluation criteria aren't a place for hidden preferences, surprise standards, or "we'll know it when we see it."

02 Best value, plainly

"Best value" sounds vague but has a real meaning: the Government picks using the evaluation factors and relative importance stated up front. The criteria announced in the solicitation drive the decision. Personal preference, vendor familiarity, and history don't.

Price

Always in the room

For customer purposes, assume price or cost will be evaluated. The CO and price team handle the formal analysis.

Quality

What makes one offer better?

Quality usually shows up through technical approach, management, staffing, personnel, schedule, risk, or past performance.

Tradeoff

Is better worth more?

If non-price factors matter, the file needs a way to explain whether a higher-rated offer is worth a higher price.

Simplicity

Don't overbuild it

Commercial and lower-risk buys shouldn't ask for a dissertation when a simple quote, capability statement, and price will do.

Sized to the buy

For simpler buys — small purchases, commercial supplies, routine services under streamlined procedures — the evaluation is lighter. A short comparative analysis, a "best suited" determination, or a quick price-and-capability check may be all the Government does. The bigger and more complex the action, the more formal the evaluation. Match the work to the buy: don't ask for a 50-page proposal on a $25,000 office-supply order.

03 Instructions vs. evaluation

Two parts of every solicitation work as a pair: the instructions tell offerors what to submit, and the evaluation explains how the Government will look at what they submitted. The names and formats vary by solicitation, but the idea is simple.

Piece | Plain meaning | Customer question
Instructions | What the offeror has to submit. | What evidence do we need from industry to evaluate this fairly?
Evaluation | How the Government will evaluate what was submitted. | What are we trying to determine from that evidence?
Alignment | The evidence requested and the factor evaluated need to match. | Are we asking for everything we plan to evaluate, and evaluating only what we asked for?

If the evaluation says the Government will evaluate the staffing approach, the instructions need to tell offerors what staffing information to submit. If the instructions ask for resumes, but the evaluation never mentions personnel qualifications, the solicitation is creating work without explaining why.

04 Factor vs. evidence

A factor is the thing the Government cares about. Evidence is what the offeror submits so the Government can judge it.

Factor | Possible evidence | What the Government is trying to learn
Staffing approach | Recruiting plan, retention plan, vacancy backfill plan, labor categories, staffing chart, resumes, certifications. | Can the contractor keep qualified people in place without performance falling apart?
Technical approach | Work plan, method, process map, sample task, assumptions, risk controls, equipment list. | Does the offeror understand the work and propose a realistic way to perform it?
Schedule | Transition schedule, delivery timeline, critical path, milestones, dependencies, material lead times. | Is the proposed schedule believable, and does it account for the actual constraints?
Past performance | Recent similar contracts, CPARS (Contractor Performance Assessment Reporting System) ratings, commercial references, problems encountered, corrective actions. | How much confidence should the Government have based on what the contractor has done before?
Price | Total evaluated price, option prices, unit prices, labor mix, discounts, delivery costs. | Is the price fair and reasonable, and how does price compare to the value offered?

When a customer says "we should evaluate staffing," the CO needs the next sentence: "to determine whether..." That sentence is where the real evaluation factor lives.

Where past performance info comes from

If the customer wants past performance evaluated, the Government has several places to look: CPARS for federal contract performance ratings, commercial references the offeror provides for non-federal work, public sources like news and incident reports, and agency-specific systems for prior awards. The customer doesn't pull these — the contracting team does. But the customer can flag what kind of past performance matters most for this requirement: similar size, similar mission, similar geography, similar technical complexity.

05 A menu of common factors

Treat this like a prompt, not a shopping cart. Pick the few things that will actually help the Government choose between offers.

Technical capability

Can they meet the minimum need?

  • Understanding of the requirement
  • Compliance with salient characteristics
  • Proposed method or process
  • Equipment, tools, facilities, or technology

Management approach

Can they organize the work?

  • Program management structure
  • Quality control process
  • Communication plan
  • Issue escalation process

Staffing

Can they put the right people in the right seats?

  • Recruiting plan
  • Retention plan
  • Backfill timeline
  • Labor mix and coverage

Key personnel

Do named people matter to performance?

  • Project manager resume
  • Required certifications
  • Relevant experience
  • Substitution controls

Schedule and transition

Can they start, ramp, and deliver on time?

  • Transition plan
  • Milestones
  • Material lead times
  • Dependencies and assumptions

Past performance

Have they done similar work well?

  • Recency
  • Relevance
  • Quality history
  • Corrective actions

Risk

What could break performance?

  • Risk register
  • Mitigation plan
  • Safety controls
  • Continuity of operations

Cyber / IT / data

Will the approach protect systems and data?

  • CUI handling
  • ATO or access timeline
  • Privacy controls
  • Data export or migration plan

Supply chain

Can they actually get the stuff?

  • Manufacturer sources
  • Lead times
  • Obsolescence risk
  • Backup sources

Training and support

Can the Government use what it buys?

  • User training
  • Help desk support
  • Warranty response
  • Documentation quality

Small business participation

Does the team use small businesses meaningfully?

  • Subcontracting approach
  • Named small business partners
  • Workshare
  • Past small business performance

Price / cost

What will the Government pay?

  • Total evaluated price
  • Option pricing
  • Unit prices
  • Price realism when appropriate

Other possible factors or evidence areas: oral presentations, sample tasks, demonstrations, product samples, prototype results, warranty terms, delivery method, geographic coverage, response time, licenses, certifications, insurance, facility clearance, subcontractor team structure, surge capacity, maintenance plan, service level agreements, metrics dashboard, data rights, transition-out plan, closeout support, environmental approach, energy use, reliability, maintainability, and accessibility.

06 How to write the evaluation sentence

Use this formula. It keeps the customer focused on the decision, not just the document name.

Evaluation sentence formula: The Government will evaluate [what the offeror submits] to determine whether [the future-risk question], including [specific evidence or considerations].
Weak

The Government will evaluate the offeror's staffing plan.

Usable

The Government will evaluate the staffing plan to determine whether the offeror can recruit, retain, and replace qualified personnel for high-turnover positions without interrupting service, including proposed recruiting sources, retention methods, vacancy backfill timelines, and supervisor coverage.

Weak

The Government will evaluate technical approach.

Usable

The Government will evaluate the technical approach to determine whether the offeror demonstrates a clear understanding of the requirement, identifies major performance risks, and proposes realistic controls for the work environment, schedule, and required outputs.

Weak

The Government will evaluate schedule.

Usable

The Government will evaluate the proposed transition schedule to determine whether the offeror can begin full performance by the required date, including site access, onboarding, material lead times, training, Government approvals, and risk to continuity of service.

07 Staffing plan example

Staffing is a great example because it shows why the noun is never enough.

If you say... | Contractors may submit... | Better customer question
Staffing plan | A generic org chart and "we will hire qualified people." | How will they keep mission-critical seats filled in a labor market where these jobs turn over?
Key personnel | Resumes that look impressive but don't map to the actual work. | Which roles genuinely drive performance, and what qualifications prove those people can do the job?
Recruiting | Broad claims about national recruiting capability. | Where will they find qualified people locally, how fast can they onboard them, and what is the backup plan?
Retention | "We value our employees." | What specific retention methods reduce turnover for these positions: pay strategy, career path, supervision, scheduling, benefits, training, or incentives?

If turnover has hurt the mission before, say that. If bad supervision has been the problem, say that. If clearances, certifications, or shift coverage are the hard part, say that. Evaluation factors are stronger when they are built around the failure modes the customer has actually seen.

08 How you want to evaluate it

When you propose a factor, also explain how you think evaluators should look at it. The CO will turn that into the formal evaluation plan; you're providing the practical customer judgment the evaluator will need later.

Compare against the requirement. Does the proposal meet the minimum need, or does it leave gaps?
Compare among offerors. Which proposal gives the Government more confidence, less risk, faster delivery, better continuity, or better support?
Look for risk. What assumptions, missing details, vague claims, or unrealistic schedules make performance less likely?
Look for proof. Did the offeror give names, dates, methods, metrics, examples, or records, or just promises?

Example: "I want to evaluate the staffing plan to determine whether the offeror has a realistic plan for high-turnover positions. I would look for recruiting sources, retention methods, supervisor coverage, how fast they backfill vacancies, and whether the plan accounts for night shifts and credential requirements."

09 Evaluator mindset

An evaluation team's job is to evaluate proposals against the stated criteria. Personal knowledge of vendors, hallway opinions, and incumbent familiarity don't count. The proposal in front of you, against the criteria announced — that's the only material.

Stay inside the criteria. If it wasn't stated as something the Government would evaluate, don't invent it during evaluation.
Use proposal evidence. Tie strengths, weaknesses, and concerns to what the offeror actually submitted.
Document the why. "Good plan" isn't enough. Explain what makes it good and why it matters to performance.
Be consistent. If the same issue appears in two proposals, treat it the same way unless there's a documented reason not to.
Avoid vibes. Personal knowledge, rumors, hallway opinions, and incumbent familiarity aren't evaluation criteria.

The vocabulary you'll hear. Once you're on an evaluation team, you'll hear specific words that have specific meanings:

  • Strength. A feature of the proposal that exceeds the minimum need or reduces risk to the Government.
  • Weakness. A flaw or shortcoming that increases risk but doesn't make the proposal unawardable.
  • Significant weakness. A flaw that appreciably increases the risk of unsuccessful performance.
  • Deficiency. A material failure to meet a Government requirement, or a combination of weaknesses serious enough to make the proposal unacceptable as written.
  • Risk. The probability and consequence of something going wrong with the offered approach.

Use them precisely. "Good plan" isn't an evaluation note. "Strength: the proposed staffing approach exceeds the minimum supervisor coverage by including night-shift backfill, reducing service-interruption risk" is.

Source selection sensitivity

Once you join an evaluation team, you handle source-selection-sensitive information: proposals, evaluator notes, ratings, comparisons, and the eventual recommendation. That information has special handling rules. Don't share proposals with anyone outside the team. Don't tell a vendor — including a favorite incumbent — anything about how their proposal compares. Don't use information from one proposal to help shape another.

Slips here disqualify offerors, taint awards, and end careers. If you're unsure whether you can share something, the answer is probably no. Ask the contracting officer.

The trap

Every factor you add creates work for industry and the Government. If a factor won't help decide who should win, leave it out. Three sharp factors beat eight vague ones every time.

10 What to send contracting

The list of nouns is the easy part. What contracting actually needs is the decision logic behind them — what would make one offer meaningfully better, what evidence proves the difference, and what failure modes the customer has seen before.

What matters most? Explain the few things that would make one offer meaningfully better than another.
What evidence should offerors submit? Plans, resumes, schedules, sample tasks, references, technical data, certifications, demonstrations, or other proof.
What are you trying to determine? Write the "to determine whether..." sentence for each proposed factor.
What failure modes are you worried about? Turn past pain into evaluation design: turnover, late delivery, weak supervision, bad transition, supply delays, poor data, or missed response times.
How important is it compared to price? Tell the CO whether the mission would pay more for a meaningfully better approach, or whether minimum acceptable plus lowest price is enough.

Next step

For each factor you want, finish this sentence before you send the package: "I want to evaluate this because the Government needs to know whether..." If you can't finish the sentence, the factor may not belong in the solicitation.