Topic I-01 — Negotiated Acquisitions

Preparing an RFP: C-Type Contracts & Part 15 Source Selection

What a C-type contract means, how an RFP is structured, and why Sections L and M are the most important documents you will write for a competitive acquisition.

1 C-Type Contracts: What Your Record Is Telling You

Every federal contract gets a Procurement Instrument Identification Number (PIID). One character in that number identifies the instrument type. When you see a C in that position, you are looking at a definitive contract, as opposed to a purchase order (P), a delivery order (D), a task order (J), or a BPA call (A).
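That decode step can be sketched as a simple lookup over the type codes named above. This is an illustrative sketch only: the sample PIID and the 9th-character position are assumptions for the example, and FAR 4.1603 is the authoritative source for PIID structure and the full code list.

```python
# Illustrative sketch: map a PIID's instrument-type character to the
# instrument it identifies, using the codes named above. The sample PIID
# and the 9th-character position are assumptions for the example; see
# FAR 4.1603 for the authoritative PIID structure and full code list.
INSTRUMENT_TYPES = {
    "C": "Definitive contract",
    "P": "Purchase order",
    "D": "Delivery order",
    "J": "Task order",
    "A": "BPA call",
}

def instrument_type(piid: str, position: int = 8) -> str:
    """Return the instrument type for a PIID, assuming the type code is
    the 9th character (zero-based position 8)."""
    code = piid[position].upper()
    return INSTRUMENT_TYPES.get(code, f"Unknown code {code!r}")

print(instrument_type("W91QUZ25C0031"))  # → Definitive contract
```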

That distinction matters because instrument type reflects the acquisition method used to award the contract. Purchase orders are the product of simplified acquisition procedures. Definitive contracts are typically the product of formal competitive source selection under FAR Part 15, or a sole-source justification and approval. The threshold where simplified acquisition ends and formal Part 15 begins depends on what you are buying.

Acquisition Scenario | Threshold | Typical Instrument
Commercial items (most agencies) | At or below the $350,000 SAT; up to $7.5M under FAR 13.5 | Purchase order
Commercial items (DoD) | Up to $9M under enhanced procedures (10 U.S.C. 3305) | Purchase order
Non-commercial, complex requirements | Above the applicable simplified acquisition threshold | C-type contract (FAR Part 15)
Sole source (any amount) | J&A required; various statutory limits apply | C-type contract

When you are handed a C-type acquisition to work, you are in FAR Part 15 territory. That means a formal Request for Proposals (RFP), a structured evaluation, a Source Selection Authority, and a documented best-value determination. The RFP is the document that sets the rules for the entire competition.

The practical takeaway: If your acquisition record shows a "C" instrument type and the contract is competitive, an RFP was used. If you are starting a new competitive acquisition above the simplified acquisition thresholds, an RFP is what you are building toward.

2 The RFP vs. the Combined Synopsis/Solicitation

Both the RFP and the Combined Synopsis/Solicitation (CSS) are solicitations. Both are posted on SAM.gov. Both invite offerors to submit proposals. But they operate under different FAR authorities and have fundamentally different structures.

Aspect | Combined Synopsis/Solicitation | Request for Proposals (RFP)
FAR authority | FAR 12.603 (commercial items), FAR 13 | FAR 15.203–15.210
Format | Simplified; no formal UCF required | Uniform Contract Format (UCF), Sections A–M
Sections L and M | Not required; evaluation often price-focused | Required; Sections L and M are the core of the solicitation
Negotiation | Typically award on initial offers | May establish a competitive range, conduct discussions, and request final proposal revisions
Typical use | Commercial, well-defined, lower-complexity buys | Complex services, R&D, system development, high-value acquisitions

The RFP is the right tool when the government needs to make a judgment call about the quality of proposals, not just the price. It gives you the structure to say: here is what we will evaluate, here is how we will evaluate it, and here is what you need to tell us to get a good score.

The CSS is not a lesser version of the RFP. It is a different tool for a different job. A CSS is appropriate when requirements are clear and the main question is price. The RFP is appropriate when technical approach, management capability, or past performance meaningfully differentiate offerors.

3 The Uniform Contract Format (UCF)

FAR 15.204-1 requires RFPs to use the Uniform Contract Format. The UCF divides the document into four parts and thirteen lettered sections. Most sections will eventually become the contract itself. Sections L and M exist only in the solicitation phase. Once award is made, they drop out.

Section | Title | Part
A | Solicitation/Contract Form (SF 26 or SF 33) | I — The Schedule
B | Supplies or Services and Prices/Costs | I — The Schedule
C | Description/Specifications/Statement of Work | I — The Schedule
D | Packaging and Marking | I — The Schedule
E | Inspection and Acceptance | I — The Schedule
F | Deliveries or Performance | I — The Schedule
G | Contract Administration Data | I — The Schedule
H | Special Contract Requirements | I — The Schedule
I | Contract Clauses (FAR/DFARS prescriptions) | II — Contract Clauses
J | List of Attachments (PWS, CDRL, etc.) | III — List of Documents
K | Representations, Certifications, and Other Statements of Offerors | IV — Representations and Instructions
L | Instructions, Conditions, and Notices to Offerors | IV — Representations and Instructions (solicitation only)
M | Evaluation Factors for Award | IV — Representations and Instructions (solicitation only)

Sections A through J form the contract. Section K covers offeror representations. Sections L and M are where the competition happens. Section L tells offerors how to respond. Section M tells both offerors and evaluators how the government will decide. If you write these two sections well, the rest of the evaluation process runs more smoothly.


4 Section L: Instructions to Offerors

Section L is the offeror's blueprint for building a proposal. It tells them what volumes to submit, how many pages each volume can be, what format to use, and exactly what to address within each evaluation factor. A well-written Section L eliminates ambiguity and makes proposals easier to evaluate because every offeror is responding to the same specific questions.

Section L typically contains:

  • L.1 — Proposal Organization Requirements: defines the volume structure, page limits, font, margins, and how to submit (SAM.gov, encrypted email, physical delivery)
  • L.2 — Proposal Instructions by Volume: walks through each volume and tells offerors exactly what to address for each evaluation factor
  • L.3 — Questions and Clarifications: deadline for questions, the no-contact-with-agency rule, and when the government will issue answers as an amendment
  • L.4 — Proposal Preparation Costs: standard clause reminding offerors the government will not reimburse proposal costs
  • L.5 — Late Proposals: reference to the late proposal rule (FAR 15.208)

The instructions for each volume should be specific enough that an offeror who follows them exactly will give evaluators everything they need to score the proposal. Vague instructions like "describe your technical approach" produce vague proposals and inconsistent evaluations.

Section L.2(a) — Volume I: Technical Approach (Example Language)

L.2(a)(1) Factor 1 — Technical Approach. Offerors shall describe their approach to performing all requirements in the Performance Work Statement (PWS). At minimum, the response shall address: (1) the methodology for performing recurring service calls and preventive maintenance tasks as described in PWS Sections 4.1 through 4.6; (2) the approach to maintaining the required 98% network uptime SLA, including contingency procedures; and (3) the transition-in plan for assuming responsibility no later than Day 30 of the performance period.

L.2(a)(2) Factor 2 — Management Approach. Offerors shall describe: (1) the organizational structure for contract performance; (2) the qualifications and roles of the proposed Program Manager and Lead Technician (résumés shall be included as Attachment A to Volume I, not counted against the page limit); and (3) the quality control plan for ensuring performance standards are consistently met.

Volume I shall not exceed 20 pages, excluding résumés. Pages shall be 8.5" x 11", single-spaced, 12-point Times New Roman or equivalent, with 1-inch margins on all sides.

Write Section M before Section L. Know exactly what you are going to evaluate and how you will score it before you tell offerors what to submit. If you write L first, you may end up asking for information that has no corresponding evaluation criterion, which puts you in a bad position later.

5 Section M: Evaluation Factors for Award

Section M is the government's commitment to offerors about how proposals will be judged. It lists every evaluation factor, states how factors compare to one another, defines the rating scale, and describes the basis for award. Offerors read Section M to understand what will win. Evaluators use it as their authority for every score they assign. The Source Selection Authority uses it to document the best-value tradeoff.

Section M typically contains:

  • M.1 — Basis for Award: the best-value or LPTA statement
  • M.2 — Evaluation Factors: factors listed in descending order of importance
  • M.3 — Relative Importance: explicit statement of how factors compare to one another
  • M.4 — Evaluation Criteria: what evaluators will look for within each factor
  • M.5 — Rating Definitions: the adjectival or color-coded scale the evaluation team will use

The adjectival rating scale most common in DoD formal source selections uses five ratings for non-price technical factors and a separate scale for past performance:

Technical/Management Rating | Definition
Outstanding | Proposal exceeds requirements in a way that significantly benefits the government. Exceptional strengths outweigh any weaknesses. Very low risk of unsuccessful performance.
Good | Proposal exceeds some requirements. Strengths outweigh weaknesses. Low risk of unsuccessful performance.
Acceptable | Proposal meets requirements. Strengths and weaknesses are offsetting. Moderate risk of unsuccessful performance.
Marginal | Proposal fails to meet some requirements but deficiencies are correctable. High risk of unsuccessful performance.
Unacceptable | Proposal fails to meet requirements. Deficiencies are not correctable or would require a major rewrite. Proposal cannot be awarded.

Past Performance Rating | Definition
Substantial Confidence | Based on the offeror's performance record, the government has high confidence the offeror will successfully perform the required effort.
Satisfactory Confidence | Based on the offeror's performance record, the government has reasonable confidence the offeror will successfully perform.
Neutral | No relevant past performance history exists. Neutral is not a negative rating.
Limited Confidence | Performance record indicates reasonable doubt about the offeror's ability to successfully perform.
No Confidence | Performance record gives the government no confidence the offeror will successfully perform.
Price is always evaluated. Even in a best-value tradeoff where Technical is the most important factor, price is still evaluated and documented. The question is not whether price counts, but whether you can pay more for a technically superior offeror and justify it. In LPTA, price is the deciding factor once technical acceptability is confirmed.

6 L and M Must Align Exactly

This is the rule that causes the most problems for new COs and is a recurring finding in protests. Every factor listed in Section M must have corresponding instructions in Section L. Every set of instructions in Section L must correspond to something in Section M. If you ask offerors to address something in their proposal but do not have an evaluation factor for it, you cannot use it to distinguish proposals. If you have an evaluation factor but gave offerors no instructions on how to address it, the resulting proposals will be inconsistent and your evaluation record will be weak.

The alignment check is simple in theory:

  • Take every factor and subfactor from Section M. For each one, find the corresponding instruction in Section L. If there is no instruction, add one or remove the factor.
  • Take every instruction from Section L. For each one, find the corresponding factor in Section M. If there is no factor, remove the instruction or add a factor.
  • Check that page limits in Section L are roughly proportional to factor weight in Section M. Giving a factor 30 pages of instruction but treating it as less important than a factor with 5 pages of instruction sends a contradictory message.
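The first two checks amount to a set comparison. The sketch below is illustrative only; the factor and instruction labels are invented for the example, not taken from any real solicitation.

```python
# Illustrative L/M cross-walk as a set comparison. Factor and instruction
# labels are invented for the example.
section_m_factors = {"Technical Approach", "Management Approach",
                     "Past Performance", "Price"}
section_l_instructions = {"Technical Approach", "Management Approach",
                          "Price", "Small Business Participation"}

# M factors with no L instruction: add an instruction or remove the factor.
missing_instructions = section_m_factors - section_l_instructions

# L instructions with no M factor: remove the instruction or add a factor.
unevaluated_instructions = section_l_instructions - section_m_factors

print(missing_instructions)      # {'Past Performance'}
print(unevaluated_instructions)  # {'Small Business Participation'}
```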
Misalignment is a protest vulnerability. GAO has sustained protests where the government evaluated proposals on criteria that were not disclosed in Section M, or used Section L instructions as evaluation criteria that were not in Section M. The offeror's right to know exactly how they will be evaluated is a fundamental fairness principle.

The simplest alignment check: sit down with Section L in one hand and Section M in the other and walk through both simultaneously before releasing the RFP. This is also when your legal counsel review is most valuable.


7 The Three Source Selection Methods

As of November 2025, FAR Part 15 now describes three source selection methods under the Revolutionary FAR Overhaul (FAR class deviation RFO-2025-15, effective November 3, 2025). The choice belongs in your acquisition plan and shapes every decision about Section L and M.

Best Value Tradeoff (FAR 15.101-1) allows the government to pay more for a technically superior offeror if the SSA documents that the additional value justifies the additional cost. Technical factors, past performance, and other non-price factors are evaluated on an adjectival scale. After evaluation, the SSA makes a tradeoff decision: is the higher-rated, higher-priced offer worth it? A well-documented tradeoff can withstand protest scrutiny. An undocumented one usually cannot.

Lowest Price Technically Acceptable (LPTA, FAR 15.101-2) awards to the lowest-priced offeror whose technical proposal passes a pass/fail threshold. There is no tradeoff. A technically acceptable offeror with a slightly better proposal gets nothing for that superiority. LPTA is appropriate when the requirement is well-defined, performance risk is low, and technical differentiation adds no real value to the government. DoD has statutory restrictions on LPTA use (10 U.S.C. 3241) — it may not be used for information technology, systems development, complex services, or when technical merit is expected to affect program outcomes.
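The LPTA selection logic reduces to a filter and a minimum. The sketch below is illustrative; offeror names, ratings, and prices are invented for the example.

```python
# Illustrative LPTA selection: filter to technically acceptable proposals,
# then award to the lowest price. All data is invented for the example.
offers = [
    ("Alpha", "Acceptable", 4_200_000),
    ("Bravo", "Unacceptable", 3_000_000),   # lowest price, but fails pass/fail
    ("Charlie", "Acceptable", 3_900_000),
]

acceptable = [o for o in offers if o[1] == "Acceptable"]
awardee = min(acceptable, key=lambda o: o[2])  # lowest price among acceptable
print(awardee[0])  # → Charlie
```

Note that Bravo's lower price buys it nothing: a proposal that fails the pass/fail threshold never reaches the price comparison.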

Highest Technically Rated with a Fair and Reasonable Price (FAR 15.103-3) is a new method added by the RFO. It is appropriate when the government determines in advance that it would not be advantageous to consider tradeoffs between price and non-price factors, and that the acquisition warrants paying any fair and reasonable price to get the highest-quality performance. In other words: you want the best, and you will pay for it as long as the price is not unreasonable.

The process works sequentially. Proposals are evaluated on technical factors only. The SSA identifies the highest-rated proposal. That proposal's price is then evaluated for fair and reasonable determination under FAR Subpart 15.4. If the price is fair and reasonable, award is made. If not, the SSA moves to the next-highest-rated proposal and repeats the price check. Price is never a tradeoff factor — it is only a ceiling check.
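The sequential process above can be sketched as a ranked walk with a price gate. Everything here is illustrative: the names, ratings, and prices are invented, and the price check is a stand-in for a real FAR subpart 15.4 price analysis.

```python
# Illustrative HRO award sequence: rank by technical rating, check only the
# top proposal's price for fair and reasonableness, and move to the next
# proposal only when the price check fails. All data is invented, and the
# price check stands in for a real FAR subpart 15.4 price analysis.
RATING_ORDER = {"Outstanding": 4, "Good": 3, "Acceptable": 2, "Marginal": 1}

def hro_award(proposals, price_is_fair_and_reasonable):
    """proposals: list of (offeror, adjectival_rating, price) tuples."""
    ranked = sorted(proposals, key=lambda p: RATING_ORDER[p[1]], reverse=True)
    for offeror, rating, price in ranked:
        if price_is_fair_and_reasonable(price):
            return offeror  # highest-rated proposal with a fair price wins
    return None  # no proposal survived the price check

offers = [
    ("Alpha", "Good", 4_200_000),
    ("Bravo", "Outstanding", 9_800_000),
    ("Charlie", "Acceptable", 3_100_000),
]
# Stand-in price check: anything at or below a $5M ceiling is "fair".
print(hro_award(offers, lambda price: price <= 5_000_000))  # → Alpha
```

In this example Bravo is rated highest but fails the price check, so award falls to Alpha; Charlie's lower price is never compared against Alpha's, because price is a ceiling check, not a tradeoff factor.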

Section M.1 Example — Highest Technically Rated
Award will be made to the responsible offeror submitting the highest technically rated proposal that offers a fair and reasonable price. The Government will not consider tradeoffs between price and non-price factors. Price will not be ranked or compared among offerors. The highest technically rated offeror's price will be evaluated for fair and reasonable determination in accordance with FAR Subpart 15.4. If the price of the highest technically rated proposal is not fair and reasonable, the Government will evaluate the price of the next highest technically rated proposal, and so on, until a fair and reasonable price is established.

This method suits acquisitions where the government genuinely needs the best available capability and cannot accept a "good enough" solution — highly specialized research, critical system development, or advisory services where the top-performing offeror provides qualitatively different outcomes. It is also useful where price spread among qualified offerors is expected to be narrow, making a tradeoff analysis a lower-value exercise.

HRO vs. BVT: the key distinction. In best value tradeoff, the SSA can award to a lower-rated offeror if the price difference is not worth the technical gap. In HRO, that is not possible — the highest-rated proposal wins as long as it is not unreasonably priced. If you want the absolute best technical solution and are willing to pay a fair price for it without second-guessing whether a slightly cheaper option is "close enough," HRO is the right tool. The RFP must explicitly tell offerors which method is being used.
Choosing the Right Method

BVT: Technical quality matters and you want the ability to make a judgment call — paying more for better, or accepting less-than-best for significant savings. Most complex services contracts.

LPTA: The requirement is well-defined, technically acceptable is genuinely good enough, and price is the differentiator. Well-specified commodity-like services.

HRO: You need the highest-rated performer, full stop. Price only needs to be reasonable — you are not trying to find a "good enough at a better price" option. Highly specialized or critical capability acquisitions.

8 How Detailed Should Your Section M Criteria Be?

FAR does not mandate how descriptive your evaluation criteria must be. FAR 15.304 requires that the RFP "clearly" identify all evaluation factors and significant subfactors and state their relative importance. The RFO adds that factors must "represent key areas of importance" supporting "meaningful differentiation between competing proposals." Beyond that, how much you spell out within each factor is a judgment call, and COs approach it differently.

Two common styles:

Minimalist: The factor name and subject matter, with little or no description of what evaluators will look for. Section M identifies what the government will evaluate; the detail lives in the SSP (which is internal and not released to offerors).

Minimalist Section M Example
Factor 1 — Technical Approach (Most Important)
Subfactor 1a: Staffing Plan
Subfactor 1b: Quality Control Plan
Subfactor 1c: Transition-In Plan

Descriptive: Each factor or subfactor includes a sentence or two explaining what evaluators will assess and what "good" looks like. Offerors know exactly what to address and evaluators have explicit written standards to apply.

Descriptive Section M Example
Factor 1 — Technical Approach (Most Important)
Subfactor 1a: Staffing Plan. The Government will evaluate the offeror's staffing plan to determine whether adequate staffing levels, appropriate labor categories, and realistic surge capacity are proposed to meet all PWS requirements without creating schedule risk.

Subfactor 1b: Quality Control Plan. The Government will evaluate the offeror's quality control plan to determine whether it includes specific inspection procedures, corrective action processes, and metrics sufficient to ensure consistent compliance with performance standards.

Subfactor 1c: Transition-In Plan. The Government will evaluate whether the proposed transition-in plan demonstrates a realistic, milestone-based approach to achieving full operational capability by the required date.

Neither style is required by FAR. Both are legally defensible if consistent with the Section L instructions and the SSP. The practical tradeoffs:

Aspect | Minimalist | Descriptive
Offeror guidance | Less; offerors may propose inconsistently | More; offerors know exactly what to address
Evaluator flexibility | More; evaluation criteria defined internally | Less; evaluators are constrained by what M says
Protest exposure | Higher if evaluators expand beyond what M says | Lower; written criteria are auditable
SSP burden | Higher; more detail must live in the internal SSP | Lower; M itself serves as the evaluation guide

The protest risk cuts both ways. If your Section M is vague, you risk a protest arguing the government evaluated proposals on unstated criteria. If your Section M is very prescriptive, you risk a protest arguing the government deviated from stated criteria because the evaluation results do not track the published language word-for-word. The standard is that the government must evaluate what it said it would evaluate, and nothing else.

For most services contracts, moderate description is the practical middle ground: name the factor, briefly state what will be assessed, and let the SSP handle the detailed scoring rubric. If your agency has a standard Section M template, use it — consistency across your office's solicitations reduces protest surface and makes evaluator training easier.


9 Practical Notes for New COs

Start with Section M. Write your evaluation criteria before you write your proposal instructions. This forces clarity about what actually matters and prevents you from soliciting information you have no plan to evaluate.

Tie Section L instructions directly to the PWS. Reference specific PWS sections in your proposal instructions. If an offeror needs to tell you how they will meet the 98% uptime SLA in PWS Section 4.3, say so explicitly. Generic instructions produce generic proposals.

Avoid using percentages for factor weight. FAR requires you to state relative importance using comparative language ("Technical is significantly more important than price") not percentages. Point systems are permitted but require careful design. Percentages invite mathematical manipulation of proposals and can restrict the SSA's judgment in a tradeoff.

The Source Selection Plan must match Section M. Before releasing the RFP, your Source Selection Plan (SSP) should define the evaluation process. The SSP is internal, but the rating criteria in it must align with what Section M says. Evaluators score against the SSP; offerors propose against Section M. They have to describe the same thing.

Set realistic page limits. Page limits that are too tight produce proposals that can't adequately address the requirement. Too loose and you create review burden without value. A rough guide: 20–30 pages for technical on a mid-complexity services contract, 10–15 for management, and a separate volume for past performance references (usually 3–5 relevant contracts with points of contact).

Get legal review before release. Your procurement attorney or legal counsel should review the final RFP, specifically Sections L and M, before it posts. This is the moment when misalignment, ambiguous factor definitions, or LPTA/HRO appropriateness questions are cheapest to fix. After release, an amendment is required to make changes, and late amendments compress offeror response time.


References

  • FAR 15.2 — Solicitation & Receipt of Proposals: covers the RFP requirement, content, amendment procedures, pre-solicitation notices, and late proposals.
  • FAR 15.204-1 — Uniform Contract Format: the authoritative list of UCF Sections A through M and what goes in each one.
  • FAR 15.304 — Evaluation Factors: requirements for listing factors, stating relative importance, and including price. The Section M foundation.
  • FAR 15.305 — Proposal Evaluation: how to conduct the evaluation, past performance assessment rules, and what goes in the evaluation record.
  • FAR 15.101 — Source Selection Approaches: best value tradeoff (15.101-1) and LPTA (15.101-2). For the new Highest Technically Rated method, see FAR 15.103-3 (RFO class deviation).
  • FAR 15.103-3 — Highest Technically Rated (RFO): new source selection method added by the RFO class deviation (effective Nov 3, 2025). Award to the highest-rated technical proposal that offers a fair and reasonable price.
  • FAR 15.3 — Source Selection: the full source selection process, including the source selection authority, evaluation team, competitive range, discussions, and best value determination.
  • DFARS 215.3 — DoD Source Selection: DoD-specific source selection procedures, including restrictions on LPTA use and SSA designation requirements.
  • SAM.gov: where RFPs are posted. Solicitations must be synopsized and the full solicitation package uploaded there.