The solicitation method that lets you describe a problem, invite industry to propose the solution, and evaluate through peer review. This module has four parts: Concepts (what a CSO is and when it fits), Execution (how to stand one up and run it), What it looks like (narrative walkthrough), and Sample language (what the actual paperwork says).
Imagine your customer walks into your office and says: "My team at the forward site can't get internet. The commercial carriers don't serve that area. Fix it." They don't know what the answer is. They just know the outcome they need.
You suspect industry has options you haven't thought of. Maybe it's low-earth-orbit satellite. Maybe fixed-wireless from a nearby tower. Maybe a mesh network. Maybe something you've never heard of. You don't want vendors racing to deliver an identical spec at the lowest price. You want them competing on which approach actually solves the problem.
That shape of problem is where CSO earns its keep. Think of it like hiring a plumber. If you say "I want my plumbing fixed and staying fixed for five years," you might get three proposals: one that reroutes the pipes, one that replaces the main line, one that packages the fix with a maintenance contract. All three can be valid. Your job is to evaluate which fits your budget and constraints, not to force all three into the same mold before they ever come in.
CSO gives you solution-space freedom. It does not give you capability beyond what your Area of Interest (AOI) describes. If you write the AOI tightly around "gasoline sedan that gets 42 mpg," vendors will come back with gasoline sedans. If you write it as "a daily-driver vehicle that averages 42 mpg or better," a hybrid becomes a valid answer. The AOI language does the heavy lifting.
An AOI is one specific problem or capability gap, written with enough detail that industry can propose solutions against it. Every CSO contains one or more AOIs. Each AOI has its own problem statement, its own evaluation criteria, its own submission window, and its own award decision.
When you hear someone say "writing a CSO," what they are almost always doing is writing AOIs. The umbrella announcement is short and stable. The AOIs are where the craft lives, and where the procurement succeeds or fails.
Two lines to anchor on:
A CSO is a solicitation method, not a contract type.
A CSO is a persistent acquisition capability, not a transaction.
The CSO itself is the announcement. You publish it, and it sits on the virtual shelf. Inside the CSO, you post Areas of Interest (AOIs). Each AOI describes a problem or capability gap in plain-enough language that industry can propose solutions against it. Vendors submit proposals. You evaluate them through peer review, usually with subject-matter experts. You award using a FAR contract or, post-FY26 NDAA, an OT agreement.
When an AOI closes or is satisfied, the CSO itself keeps running. You can post new AOIs as new needs appear. That persistence is what separates a CSO from a one-off buy: you're standing up a repeatable problem intake, not running a single transaction.
The umbrella: persistent posting on SAM.gov. States authority, scope, evaluation approach, submission process. Stays live for the fiscal year or longer and changes rarely.
AOI 1: forward-site connectivity. Own problem statement. Own due date. Own peer review and award.
AOI 2: counter-UAS detection at the fence line. A different problem. Evaluated separately.
AOI 3: posted two months later when a new need surfaces. Umbrella never had to change.
One umbrella, many AOIs. The umbrella is stable; the AOIs come and go as problems surface. Each AOI is its own tiny competition.
The idea of "here's the problem, propose your solution" did not arrive with CSO. It's been sitting in the FAR for decades.
What CSO added was scaffolding. A named solicitation method. The AOI structure. Rolling intake. Peer review as the defined competition mechanism. A statutory pathway to sole-source follow-on production. If you've ever written a Part 12 statement of objectives and let vendors propose how to meet it, you already operate with the CSO instinct. The vehicle gave that instinct a formal home with scaling features built in.
10 USC §3458 is the authorizing statute. It authorizes DOD to acquire commercial products and services through a competitive selection of proposals resulting from a general solicitation, evaluated by peer review, technical review, or operational review (as appropriate).
DFARS Subpart 212.70 implements the statute for DoD use, including procedural guidance on CSO establishment, AOI publication, and award.
FY26 NDAA Section 1823 (signed into law December 18, 2025) amended 10 USC §3458 substantially. If you learned CSO before December 2025, several things you remember are now obsolete. See the next section.
Those citations are the three moving parts: the statute grants the authority, the DFARS implements it, and the FY26 amendment reshaped both. Four changes from that amendment are worth knowing before you write your first CSO under the new rules:

- Awards may now be made as OT agreements, not just FAR contracts.
- The scope explicitly covers commercial products, commercial services, and nondevelopmental items, so the authority no longer implies an R&D flavor or a color of money.
- Sole-source follow-on production authority to the original CSO awardee was expanded.
- The $100M approval requirement on follow-on production awards that existed in earlier versions was removed.
AOI granularity is a judgment call, and the failure modes on either side are worth understanding.
Too narrow means you've pre-picked the solution in your own language. If the AOI reads "Cisco Catalyst 9300, 48-port, PoE+, part number WS-C9300-48P," every vendor comes back quoting the same item. The solution space is zero. The CSO machinery (AOI publication, rolling intake, peer review panels, CSO umbrella administration) is doing no useful work because there's nothing for vendors to be creative about. A Part 12 commercial buy would have gotten you there faster and with less overhead. You're also drifting from the purpose of the authority: CSO exists for problem-statement procurement, and "I know the exact part number" is not a CSO-shaped problem.
Too broad means peer reviewers are asked to compare things that aren't comparable. An AOI like "solutions for base IT challenges" might bring in a cloud migration at $4M, an endpoint-management tool at $400K, and a help-desk staffing contract at $2M a year. Those proposals aren't competing against each other. They're answering different questions. Every evaluation criterion you try to apply turns into an apples-to-oranges judgment, and the competition CICA requires starts to look hollow on paper. Protest exposure grows because a losing vendor can credibly argue their proposal was never fairly compared to what won.
The sweet spot is an AOI narrow enough that every proposal you expect to get is recognizably answering the same problem, and broad enough that vendors can propose meaningfully different technical approaches to it. A working stress test: can you write evaluation criteria that apply consistently to every plausible proposal? If you find yourself writing criteria that only apply to some, the AOI is too broad and needs to be split into separate AOIs under the CSO umbrella.
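If it helps to make that stress test mechanical, here is a toy sketch in Python (the criteria and proposal shapes are hypothetical, borrowed from the too-broad "base IT challenges" example above, not from any real AOI): list every criterion, list every plausible proposal shape, and flag criteria that do not apply across the board.

```python
# Toy AOI-granularity stress test. Criteria and proposal shapes are
# hypothetical illustrations echoing the "base IT challenges" example.
criteria_applies_to = {
    "reduces time-to-resolution for base IT tickets": {"cloud migration", "endpoint tool", "help desk"},
    "data migration and cutover plan":                {"cloud migration"},  # meaningless for the other two
    "staffing surge elasticity":                      {"help desk"},
}
plausible_shapes = {"cloud migration", "endpoint tool", "help desk"}

# A criterion that only applies to some shapes is the "split this AOI" signal.
non_universal = [c for c, shapes in criteria_applies_to.items()
                 if not plausible_shapes <= shapes]

for c in non_universal:
    print(f"criterion applies to only some proposals: {c}")
# Two hits here -> this AOI is too broad and wants to be split.
```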
CSO is a great fit for some problems and a forced fit for others. Check yourself against both sides.
A few authorities can look like CSO at a glance but are legally distinct. Keeping them separate in your head will save you arguments with your counsel later.
| Authority | How it compares to CSO |
|---|---|
| FAR Part 12 (Commercial) | Same commercial space as CSO, but Part 12 is a transactional acquisition against a known requirement. CSO is a persistent capability with AOIs against a described outcome. |
| FAR Part 13 (Simplified, post-RFO) | Part 13 now handles non-commercial simplified buys under the SAT. It does not give you rolling intake or peer-review evaluation. Commercial sub-SAT buys moved to Part 12 under the Revolutionary FAR Overhaul. |
| FAR Part 15 (Negotiated) | You can run outcome-based procurements under Part 15, but the proposal and evaluation overhead is significantly heavier than a CSO and there's no persistent shelf. Reserve Part 15 for one-time procurements where the formal negotiation framework pays off. |
CICA requires competition. CSO satisfies CICA because the general solicitation paired with peer review is the competitive mechanism. You're running competition differently, not skipping it.
CSO awards are protestable at both GAO and the Court of Federal Claims. Practitioners sometimes assume that "alternative authority" means protest-free, and that's not the case here. Your evaluation documentation, source selection decision, and debriefings should carry the same rigor as any FAR-protestable award.
Pre-FY26: fixed-price or fixed-price incentive under a FAR contract.
Post-FY26: all of the above, plus OT agreements.
Cost-reimbursement is not authorized under CSO. If your requirement can't be structured as fixed-price, CSO isn't your vehicle.
The OT option is powerful, but it comes with a different rulebook than a FAR contract. OTs are not subject to the FAR, which means IP and data rights are negotiable territory, protest exposure is narrower, and the clauses you attach are the clauses you and the vendor agree to (not a standard FAR clause set). If you award a CSO as an OT, know what you're signing up for. We'll walk that decision in the Execution module.
CSO is authority-agnostic on color of money. 3600 RDT&E is the most common you'll see, because the original "innovative" scope attracted R&D-flavored requirements, but CSO-funded work shows up under 3010 Aircraft Procurement, 3020 Missile Procurement, 3400 O&M, SBIR/STTR set-asides, working capital funds, and more.
The rule of thumb: let the requirement drive the color of money, not the authority. If you find yourself funding something with RDT&E because "we're using CSO so it must be RDT&E," stop and check fiscal law. Post-FY26, CSO explicitly covers commercial products and non-developmental items that are nowhere near R&D, so the authority no longer implies a color of money at all.
Concepts gets you to the point of saying "yes, this looks like a CSO-shaped problem." Execution picks up from there: standing up a CSO in practice, writing AOIs that are neither too narrow nor too broad, running peer review, documenting the Adequate Price Competition determination, building the price-negotiation memorandum for an APC-based award, and deciding whether to award as a FAR contract or an OT.
This tab covers standing up the CSO, writing AOIs, running peer review, the APC-based PNM, and the FAR-vs-OT award decision. If you'd rather see one happen end-to-end before the mechanics, the What it looks like tab tells the whole thing as a story.
An AOI (Area of Interest) is one specific problem or capability gap posted under the CSO umbrella. Each AOI has its own problem statement, its own evaluation criteria, its own submission window, and its own award decision. The umbrella is the storefront; AOIs are the actual asks.
Anywhere in this tab you see "write the AOI," "evaluate against the AOI," "AOI misscoped," "AOI follow-on," that is the unit being referred to. If any of this is unfamiliar, the Concepts tab covers what a CSO is and how AOIs sit inside it.
Before writing the announcement, nail down a few things with the program office and the other stakeholders. These conversations save weeks of rework later.
A CSO that hasn't been scoped up front will be scoped for you by whichever vendor proposal lands first.
Items to agree on before publishing:
Write the problem statement in plain language first, before any FAR or OT vocabulary. If you cannot explain what you are solving to someone outside the program office in three sentences, the AOI will bleed that same confusion onto vendors and they will all propose different things.
The umbrella is the parent announcement that lives on SAM.gov. Individual AOIs live under it. The umbrella says "this is a CSO, here is who we are, here is the general focus area, here is how submissions work." The AOIs do the real work of soliciting specific proposals.
The umbrella is the storefront. The AOIs are the specific things for sale.
What the umbrella announcement usually contains:
Publication venue. SAM.gov is the default and is what establishes the formal competitive solicitation. Some organizations also publish on an agency innovation portal to drive awareness, but the SAM.gov posting is the one that matters for the record.
Intake rhythm. AOIs can open and close on their own cadence under the umbrella. Rolling intake (continuously open) is common. Wave intake (quarterly windows) is a useful structure if evaluator bandwidth is tight or if the program office wants to do a batched comparison across several AOIs.
If you find yourself amending the umbrella every month, the umbrella is doing too much work and should be split into two umbrellas, or policy that belongs in individual AOIs has crept into the umbrella. The umbrella should read roughly the same at the end of the fiscal year as it did at the start.
The AOI is where most CSOs succeed or fail. A working AOI describes a problem cleanly, sets desired outcomes as floors (not ceilings), and gives peer reviewers a usable rubric.
Anatomy of a working AOI:
Common AOI failure modes:
Concepts section 6 covers AOI granularity in depth (too narrow, too broad, sweet spot). If you have not read that, read it before drafting your first AOI. The granularity calls are the ones that come up repeatedly.
Peer review is the CSO-specific evaluation method. It replaces the traditional FAR Part 15 Source Selection Evaluation Board (SSEB) structure with a lighter process built around subject-matter experts answering a narrower question.
Peer review is not a tradeoff analysis. Reviewers are answering whether this proposal, on its merits, meets the AOI to a degree that warrants award.
Panel composition. Typical panels are three reviewers per proposal. At least one technical SME and at least one user or operational representative. The contracting officer runs the process but does not vote on technical merit. Conflicts of interest are documented in writing. Reviewer identities are not shared with offerors (non-attribution is the standard).
Process. Each reviewer reads the proposal independently and completes a structured template (strengths, weaknesses, fit to AOI, overall recommendation). The panel then convenes for consensus or majority discussion. The CO documents the decision and any minority views.
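As a rough sketch of the bookkeeping involved (the field names are hypothetical, not a prescribed DoD template), the structured template and the majority/minority step might look like this:

```python
from dataclasses import dataclass, field

# Hypothetical reviewer template mirroring the fields named above:
# strengths, weaknesses, fit to AOI, overall recommendation.
@dataclass
class ReviewSheet:
    reviewer_role: str                 # role only; identities stay non-attributional
    strengths: list[str] = field(default_factory=list)
    weaknesses: list[str] = field(default_factory=list)
    fit_to_aoi: str = ""               # narrative assessment, not an adjectival rating
    recommend_award: bool = False

def panel_outcome(sheets: list[ReviewSheet]) -> tuple[bool, list[str]]:
    """Majority recommendation, plus the minority views the CO must document."""
    majority = sum(s.recommend_award for s in sheets) > len(sheets) / 2
    minority = [s.reviewer_role for s in sheets if s.recommend_award != majority]
    return majority, minority

sheets = [
    ReviewSheet("technical SME", recommend_award=True),
    ReviewSheet("user representative", recommend_award=True),
    ReviewSheet("security SME", recommend_award=False),
]
recommend, minority_views = panel_outcome(sheets)
print(recommend, minority_views)       # True ['security SME']
```

The code is not the point; the point is that the record is a per-reviewer sheet plus a documented outcome with minority views, not a scored matrix.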
Clarifications ask a proposal to point at something it already claims; revisions invite the offeror to change the offer. That is the line that separates peer review from FAR Part 15 discussions. Cross it deliberately or not at all.
Drifting across this line without calling it out in the file is how a defensible CSO award becomes a contested one. If the panel finds itself asking for changes rather than asking for clarification, pause, talk to the CO, and make the conversion explicit in the record.
When to close without award:
Document these decisions the same way you would document any non-award. They will get questioned.
Non-attribution is a feature of peer review, not a formality. Reviewers give franker assessments when offerors will not see who said what. Keep the individual reviewer templates inside the award file but do not release them outside the panel and the CO's record.
Worth setting the baseline first: CSO awards are commercial products, commercial services, or nondevelopmental items, so certified cost or pricing data is already off the table under FAR 15.403-1(b)(3). TINA thresholds are not the question here. The question APC answers is narrower and practical: does the competition itself carry the price-reasonableness determination, or does the CO have to build that determination from other data?
FAR Part 15 citations appear here because that is where the pricing toolbox lives (the APC definition at 15.403-1(c)(1), the commercial-item exemption at 15.403-1(b)(3), price analysis techniques at 15.404-1(b)(2), data other than certified cost or pricing data at 15.403-3). Part 12 commercial procedures point back to those tools rather than duplicating them. Citing Part 15 for price analysis does not convert a CSO award into a Part 15 source selection. The chassis is still Part 12. The evaluation method is still peer review. Part 15 is being used as a toolbox here, not as the blueprint for how to run the procurement.
APC (defined at FAR 15.403-1(c)(1)) is the condition that lets CSO pricing rest on the competition itself. If you have APC, the competed prices are the evidence of reasonableness and the pricing file is correspondingly light. If you do not have APC, you are running a price analysis using catalog prices, similar commercial sales, prior competitive buys, independent estimates, or information the offerors provide under FAR 15.403-3, and the pricing file gets heavier in proportion.
The five conditions (paraphrased from the FAR): (1) two or more responsible offerors; (2) competing independently; (3) submitting priced offers that satisfy the Government's expressed requirement; (4) award to the offeror whose proposal represents the best value, where price is a substantial factor in the evaluation; (5) no finding that the price of the otherwise successful offeror is unreasonable.
When APC works cleanly: multiple independent offerors answering the same AOI with comparable solution shapes, priced inside a recognizable band.
When APC breaks down: a single responsive offer, offers that are not genuinely independent, or proposals so structurally different that their prices cannot meaningfully be compared.
Fallbacks when APC does not apply: catalog prices, similar commercial sales, prior competitive buys, independent government estimates, or offeror-provided data under FAR 15.403-3.
The APC determination is the backbone of the pricing file. Document it explicitly against this procurement's facts rather than by boilerplate.
A CSO awarded on APC still gets a Price Negotiation Memorandum, but it reads differently from a traditional negotiated-procurement PNM.
Standard sections for an APC-based PNM:
Common PNM mistakes on a CSO award:
Cross-reference existing PNM training on the site for general structure. The document layout is the same. What differs here is the basis (peer review plus APC) rather than the form.
Post-FY26, the CSO authority lets you award as a FAR contract or as an Other Transaction. Same solicitation, different award vehicle. The decision usually lives after proposals come in rather than before, because the right answer can depend on what is offered and who is offering it.
OT tradeoffs to weigh:
The meta point: the procurement authority is CSO in both cases. The vehicle is a separate decision that follows the competition outcome rather than defining it.
Once peer review concludes and pricing is determined, award mechanics depend on the vehicle.
FAR contract path:
OT agreement path:
Either path: the award file contains the full peer review record, the APC determination or alternative pricing analysis, the PNM, and any clarifications correspondence.
The FY26 NDAA expanded sole-source follow-on authority under CSO. This is one of the more powerful tools in the statute and one of the most misused.
What the statute allows post-FY26: a follow-on production award, sole-source, to the original CSO awardee, for commercial products, commercial services, or nondevelopmental items. The $100M approval requirement that existed in earlier versions was removed.
What the follow-on authority does not allow:
Documentation burden is still substantial:
The classic mistake is treating follow-on pricing as light because the original was CSO-competed. APC applied to the original procurement. The follow-on is a sole-source action and needs its own price reasonableness analysis, usually through cost analysis, prior-price comparison, or documented market data. Build the price file as if there were no competed predecessor, then reference the predecessor where it genuinely supports the analysis.
CSO awards administer like their underlying vehicle. FAR contract awards follow normal FAR administration. OT awards follow the OT agreement terms. A few CSO-specific points come up in administration.
CDRLs and deliverables. AOIs written as outcomes translate to outcome-based CDRLs. If the AOI said "capable of processing X throughput," the acceptance criterion should measure X throughput, not a proxy. Avoid sneaking spec-level requirements into CDRLs that the AOI did not contain. That is scope expansion by another name.
Performance measurement. Tie PM back to the AOI language. If the offeror proposed to deliver outcome X via approach Y, the PM approach has to let you see whether approach Y is delivering outcome X. Surveillance and Performance Monitoring (SPM) practices apply the same way they do on any contract; the content of the surveillance plan is shaped by the AOI rather than by a generic template.
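For the wait-time problem used throughout this module, the surveillance arithmetic is simple enough to sketch (a toy illustration with fabricated timestamps; the 30-minute floor and the arrival-to-completion measurement come from the scenario's AOI):

```python
from datetime import datetime, timedelta

# Toy check against an outcome-based acceptance criterion: average wait
# under 30 minutes, measured from customer arrival to transaction complete.
FLOOR = timedelta(minutes=30)

visits = [  # (arrival, transaction complete) -- fabricated sample data
    (datetime(2026, 6, 1, 9, 0),  datetime(2026, 6, 1, 9, 18)),
    (datetime(2026, 6, 1, 9, 5),  datetime(2026, 6, 1, 9, 41)),
    (datetime(2026, 6, 1, 9, 30), datetime(2026, 6, 1, 9, 47)),
]

waits = [done - arrived for arrived, done in visits]
average = sum(waits, timedelta()) / len(waits)

# Measures the outcome itself, not a proxy like kiosk uptime.
print(f"average wait {average}, meets the floor: {average < FLOOR}")
```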
Peer-reviewer feedback loop. After performance has run for a while, pull a couple of the original peer reviewers back in to look at the performance data. They will tell you whether the AOI was well-scoped in hindsight. That feedback becomes the starting point for the next AOI on a similar topic.
Commercial termination posture. If the contract has to be terminated for convenience during performance, and the underlying vehicle is a FAR commercial-item contract, the termination follows the commercial path under FAR 12.403 and the clause at 52.212-4(l) rather than running a FAR Part 49 settlement. Part 49 still exists and is referenced in 12.403, but commercial T4C does not walk through a Part 49 settlement proposal. OT terminations follow the OT agreement's own termination terms.
Protest posture differs materially by vehicle.
FAR contract CSO awards are protestable at GAO and at the Court of Federal Claims under the normal rules for commercial competitive procurements. The peer review process does not exempt the award from protest, and the peer review record will be part of the agency report if a protest is filed.
OT awards from a CSO have a narrower protest posture. COFC has jurisdiction only in limited circumstances. GAO typically does not hear OT protests. This does not mean OTs are unprotestable in practice, but the bar and the venue are different, and industry counsel will tell you so.
Document as if protested, regardless of vehicle:
If the peer review documentation is defensible to a GAO attorney on the phone, the award can usually defend itself. If it is not, that is a signal to strengthen the record now rather than after a protest lands.
Closeout follows the underlying vehicle. A FAR contract closes per FAR Part 4 procedures. An OT closes per the OT agreement's closeout provisions. A few CSO-specific things deserve attention at closeout.
AOI quality review:
Feedback loops:
The umbrella and the AOI template are institutional assets. They are not one-time documents. The closeout file is the starting point for the next CSO's lessons-learned brief and for the next iteration of the organization's CSO playbook.
Concepts covered what a CSO is and when it fits. Execution covered standing one up and running it through closeout. If you want the whole thing told as a story before you close the tab, the What it looks like tab walks one from customer call to follow-on. From here, the Training page has adjacent specializations; the FAR Overhaul Tracker covers the RFO changes that touch award clauses; and the FAR Comparator helps when a clause question comes up mid-process.
The Concepts and Execution tabs explain what a CSO is and how to run one. This tab shows a whole one happening, from customer phone call to award to follow-on, so the mechanics feel ordinary instead of exotic.
Along the way you will see three real-world comparators that map onto CSO phases: a job posting for how you write the AOI, Shark Tank (minus cameras, minus the "I'm out") for how peer review actually feels, and a farmer's market for how the price check works when multiple independent offers come in. The contracting vocabulary is kept to a minimum on purpose. The other tabs do the precise work.
Captain Reyes runs the base processing center. Incoming personnel spend more than two hours in line on a normal day, and the summer PCS surge is eight weeks out. Her team is burned out, the customers are frustrated, and she has no idea whether the fix is kiosks, an app, more contract staffing, or something she has not thought of. She calls you.
You notice the shape of the problem. She can describe the outcome cleanly. She cannot describe the solution, and she probably should not be the one describing it. That is a CSO-shaped problem.
Your shop already has an umbrella CSO running. It has been on SAM.gov since the start of the fiscal year with the standard scope language and the evaluation approach spelled out. That means you do not need to stand up a whole new solicitation; you just need to post a new Area of Interest under the umbrella.
A good job posting describes the role and what success looks like. It lists the real constraints ("must hold a clearance," "on-site three days a week") and leaves room for a range of candidates to apply. A bad job posting reads like a single resume someone already has on their desk; a bad AOI reads like a single vendor's catalog.
You are describing what "fixed" looks like for Capt. Reyes, not picking the winner in advance.
Here is roughly what your AOI says: the problem (two-hour waits on a normal day, a surge eight weeks out), the outcome floor (average wait under 30 minutes, sustained across a 90-day window that includes a surge), the real constraints (LDAP identity verification, document chain-of-custody per SOP, the existing building footprint), and the criteria the peer review panel will apply.
Nothing in the AOI names a product, a vendor, or a technical approach. If the right answer is kiosks, that is fine. If the right answer is a scheduling app, that is also fine. If the right answer is a managed staffing model, that is fine too. You are paid to judge between them, not to pick the shape in advance.
Four weeks later, the first review window closes with three responsive proposals. They are answering the same problem, but they are answering it differently enough that the peer review panel will have something real to choose between.
Vendor A: self-service check-in kiosks for routine transactions, live staff routing for exceptions, simple LDAP integration. Familiar technology, known implementation pattern. $385,000 fixed-price.

Vendor B: smartphone app with pre-arrival document upload, appointment slots, and wait-time visibility. Requires a standing data-flow agreement and a new SSO pattern. $412,000 fixed-price.

Vendor C: outsourced front-desk staffing with surge elasticity, tied into existing processes. No technology change; the answer is more throughput via more people during peaks. $398,000 first-year fixed-price.

Three different shapes. Three comparable price bands. Each one could plausibly get the wait time under thirty minutes. The panel's job is to judge which one actually will.
Your evaluation panel is three people: an IT SME from the base comm squadron, Capt. Reyes's deputy from the personnel office, and a security SME who cares about how identity verification and document handling are actually done.
On Shark Tank, each Shark reads the pitch on their own, asks clarifying questions, and decides independently what they think. Then the Sharks talk to each other, and the group sometimes shifts opinions. Nobody is pretending to be objective about a rubric; they are applying their expertise to a problem they understand.
Your three reviewers do the same. Read independently first. Score against the AOI criteria. Then meet. Minus the theater, and with the CO running the process but not voting.
Independently, the three reviewers come back with recognizable patterns. The IT SME likes Vendor A's simplicity and is nervous about Vendor B's new SSO. The personnel deputy likes Vendor B's pre-arrival document upload because it attacks the paperwork bottleneck directly. The security SME likes Vendor A because the LDAP integration is one he has seen work. Vendor C gets polite nods and limited enthusiasm; the panel feels it treats the symptom (throughput) without addressing the root cause (process friction).
In the group discussion, the personnel deputy's point about pre-arrival uploads moves the IT SME to take Vendor B more seriously, but the security SME's integration concerns hold, and the deputy concedes that the new SSO and data-flow work puts the surge date at risk. The panel reaches consensus in about an hour: Vendor A is the recommendation, as the lowest-risk path to the wait-time outcome, with a straightforward technology footprint the base can actually absorb.
No tradeoff narrative. No adjectival color ratings. No three hundred pages of source-selection documentation. A structured template per reviewer, a consensus memo, and the CO's award decision.
During the independent read, the IT SME flags a question about Vendor A's proposal. The proposal mentions "directory services integration" but never states explicitly whether it connects via LDAP or requires Active Directory. The base runs LDAP; the AOI said so. The proposal is probably compliant, but you want it on the record.
You call Vendor A. The question is: can you confirm that your solution supports LDAP, which is what the base runs? Vendor A says yes, the base price already includes the LDAP connector. You capture the exchange in the file as a clarification.
You asked the proposal to point at something it already claimed. You did not invite Vendor A to change their offer, add scope, or re-price. That is the line between a clarification (inside peer review) and a revision (would convert the AOI to a negotiated procurement). On this AOI you stayed on the clarification side of the line and the file reflects it.
The pricing file writes itself, almost. Three independent offerors came in at $385K, $398K, and $412K. That is a spread of roughly seven percent across genuinely different solution structures, all fixed-price. Competition doing its job.
If you walk up to a farmer's market and five vendors are selling tomatoes between $3.75 and $4.50 a pound, you do not need an audit or a cost breakdown to know that $4 is reasonable. The market itself is telling you what reasonable looks like.
That is exactly what Adequate Price Competition means on this CSO. Three independent offerors, responding to the same AOI, priced inside a tight band. The competition carries the reasonableness determination. The PNM walks the five APC conditions against these facts in about a page.
Because the solutions are commercial, certified cost or pricing data is not in play either way. Commercial items are exempt by rule. The PNM says so in one sentence and moves on. No cost buildup, no should-cost model, no audit request. Price analysis anchored on the competed offers, backed by an independent government estimate that came in at $390K. The award price of $385K sits cleanly inside the band.
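The arithmetic behind that paragraph, spelled out as a sketch (the prices and the $390K IGE are the scenario's numbers; this is an illustration, not a required PNM exhibit):

```python
# Price-band arithmetic from the scenario: three independent fixed-price
# offers plus the independent government estimate (IGE).
offers = {"Vendor A": 385_000, "Vendor B": 412_000, "Vendor C": 398_000}
ige = 390_000

low, high = min(offers.values()), max(offers.values())
spread = (high - low) / low            # (412K - 385K) / 385K ~= 7.0%
print(f"spread across offers: {spread:.1%}")
print(f"IGE inside the competed band: {low <= ige <= high}")          # True
print(f"award price is the low offer: {offers['Vendor A'] == low}")   # True
```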
Vendor A wins. You award a FAR commercial-item contract on an SF 1449 with the Part 12 clause set, priced at $385K fixed-price, with a ninety-day sustained-performance measurement window after go-live.
You send post-award notice to Vendors B and C the same day. Both request debriefings; you run the commercial-item debrief pattern, which is lighter-touch than a Part 15 source-selection debrief but still covers the basics of why their proposal was not the one awarded. Both vendors thank you. Neither protests.
AOI posted → award: about ten weeks. First review window four weeks, peer review and clarifications about two weeks, pricing file and PNM about a week, contract writing and internal review about two weeks, signature and post-award notice the tenth week. Capt. Reyes gets her kickoff meeting with Vendor A two weeks before the summer surge begins.
Summer arrives. The kiosks hold. Wait times average under twenty minutes during the surge, well inside the AOI's thirty-minute floor. The paperwork chain-of-custody does not break. Capt. Reyes's team sleeps.
Six months in, a sister base across the region hears about the implementation and asks their CO whether they can get the same thing. Their requirement is recognizably the same shape: same wait-time problem, same building footprint, same LDAP environment.
The FY26 NDAA change matters here. Because the original CSO was openly competed with peer review and Vendor A was the awardee, the sister base can pursue a sole-source follow-on production contract to Vendor A for their own implementation, without running a full new competition. That path is built into the statute. The sister base still has to write a proper justification, still has to do its own price reasonableness analysis on the follow-on (APC attached to the original award, not to the new one), and still has to document its own approvals. But the competitive rework that used to come with a new base's requirement is gone.
One CSO umbrella, one AOI, one competed award, one sister-base follow-on. Ten weeks to original award, a few weeks to the follow-on. Three vendors got a fair look. The problem got solved. The file is defensible. This is what a CSO is supposed to feel like.
The Concepts tab explained what authority lets this happen. The Execution tab covered how to actually run each phase. This tab was the same story told as a story, so that the first time you get your own Capt. Reyes call, the shape feels familiar.
Start in Execution section 1 ("Before you stand one up") for the pre-publication scoping conversations, or jump back to Concepts for the authority and structure refresher.
The naming gets tangled in practice and nobody writes it down cleanly, so here it is cleanly. A CSO produces two separate pieces of paper on SAM.gov. Together, they are "the solicitation." Separately, they have their own names and do different work.
The sections below show representative language for each document, tied to the Capt. Reyes scenario from the previous tab so you can see the umbrella, the AOI, the clarification email, and the post-award notice as they would actually read. The values in brackets ([Date], [Installation]) are placeholders you would fill in for a real procurement.
This is the parent announcement. It goes up once per fiscal year (or longer), stays on SAM.gov, and says nothing specific about any one problem. It describes who you are, what authority you are using, how submissions work, and how proposals get evaluated. Individual AOIs reference it but do not repeat it.
This Commercial Solutions Opening is issued under 10 U.S.C. § 3458 (Commercial Solutions Opening) as amended, implemented by DFARS Subpart 212.70.
The [Issuing Activity] invites submission of proposals for commercial products, commercial services, and nondevelopmental items addressing the capability needs described in Areas of Interest (AOIs) posted under this announcement.
AOIs posted under this CSO will address installation operations, mission support, personnel administration, and related capability areas. Specific problem statements will be described in each AOI. This list is illustrative; the active AOIs posted under this announcement are controlling.
Proposals shall be submitted in response to a specific AOI. Proposals submitted without reference to an active AOI will not be evaluated. Each AOI will state its own submission format, page limits, and deadline.
Proposals will be evaluated by peer review, conducted by subject-matter experts against criteria stated in the specific AOI. No adjectival ratings or tradeoff narratives will be produced. The Contracting Officer will document the peer review outcome and make the award decision.
Awards made under this CSO may be FAR commercial-item contracts (fixed-price or fixed-price incentive) or Other Transaction Agreements authorized under 10 U.S.C. § 3458 as amended. Cost-reimbursement is not authorized.
This umbrella announcement is effective through 30 September [Year]. Amendments, if any, will be posted to SAM.gov as notice updates.
This CSO is open to all responsible sources. Individual AOIs may carry set-aside or other competition restrictions; any such restriction will be stated in the AOI itself.
Active AOIs are posted as separate SAM.gov notices and are linked from this announcement's attachments section. Vendors shall read the applicable AOI in conjunction with this umbrella announcement before submitting a proposal.
This is the AOI that would have gone up for Capt. Reyes's processing-center problem. It is the document the three vendors actually responded to. Short, problem-focused, with outcome floors instead of a technical specification.
The base processing center handles inbound personnel check-in, outbound PCS out-processing, and related personnel administrative transactions for the installation. Average customer wait time during normal operations exceeds two hours. Seasonal surge periods produce further degradation and sustained customer complaints. The installation is seeking solutions that reduce wait times to a managed level without compromising the integrity of personnel document handling or identity verification.
The proposed solution shall sustain an average customer wait time of under 30 minutes across a continuous 90-day measurement window that includes at least one seasonal surge period. Measurement begins at customer arrival at the processing center and concludes at completion of the transaction.
Evaluation will be conducted by a three-person peer review panel drawn from the installation's IT, personnel, and security functions. No adjectival ratings will be applied.
[CO Name], [Office], [Email], [Phone].
Written questions will be accepted through [Date]. Responses will be posted as an amendment to this AOI on SAM.gov.
This is the email you would send to Vendor A during peer review when the IT SME flagged the LDAP question. It is a clarification (asking the proposal to point at something it already claims), not a revision (inviting a change). The language makes that line explicit on the record.
Subject: Clarification Request, AOI-W12345-25-002, [Vendor] Proposal
[Vendor POC],
During peer review of your proposal submitted against AOI-W12345-25-002 (Base Processing Center Wait-Time Reduction), the panel asked for clarification on the directory services integration approach described in your proposal.
Your proposal states that the solution "integrates with standard directory services for user authentication" (page 7). The AOI specified LDAP integration as a real constraint. Please confirm whether LDAP connectivity is included at the proposed base price, or whether an additional module, license, or services effort would be required to achieve LDAP integration in the installation's existing environment.
This request is a clarification, not an invitation to revise your proposal. Please limit your response to identifying or confirming information already contained in your submitted proposal. A written response is requested by [Date].
Respectfully,
[CO Name]
Contracting Officer
[Contracting Office]
After award, unsuccessful offerors get a post-award notice. Commercial-item procedures under Part 12 allow a lighter touch than a Part 15 source-selection debrief; the obligation to inform is unchanged, the procedural formality is reduced.
Subject: Post-Award Notice, AOI-W12345-25-002
[Offeror POC],
This notice confirms that award for AOI-W12345-25-002 (Base Processing Center Wait-Time Reduction) was made on [Date] to [Awardee], at a total evaluated price of $385,000.
Three responsive proposals were received and evaluated by peer review. The awarded offeror was selected as providing the best overall response against the evaluation criteria stated in the AOI.
A brief post-award explanation is available upon written request submitted within five business days of the date of this notice, in accordance with the commercial-item procedures in FAR Part 12. A request for explanation shall be addressed to the undersigned.
The [Contracting Office] appreciates your proposal and your interest in supporting the [Installation] mission.
Respectfully,
[CO Name]
Contracting Officer
[Contracting Office]
A few wording patterns show up in draft AOIs and quietly weaken the solicitation. They are not illegal, and they usually reflect a program office borrowing language from a prior spec-based procurement. They are worth catching at draft review.
"Solution shall be based on the Acme CheckPoint 3000 kiosk platform or equivalent."
Products in the AOI signal the outcome already has a preferred shape. Drop the CSO vehicle and run a Part 12 branded-item buy; that is what this language actually describes.
"Solution shall provide customer self-service for routine check-in transactions with staff routing for exceptions."
Outcome-based phrasing leaves room for a kiosk answer, an app answer, or a staffing answer. The peer review, not the AOI, picks between them.
"Proposals rated Excellent on technical approach will be given preferential consideration at award."
Adjectival ratings belong to the Part 15 source-selection world, not to CSO peer review. Language like this invites a protester to argue the procurement was actually run as a Part 15 tradeoff and should have followed Part 15 procedures.
"Proposals will be evaluated by peer review against the criteria stated in this AOI. No adjectival ratings or tradeoff narratives will be produced."
Plain about the evaluation method. Names the criteria as the thing being applied. Forecloses confusion about what the reviewers are producing.
"This AOI addresses personnel, IT, facility, and logistics challenges at the installation."
Four problems in one AOI. Peer reviewers will be asked to compare proposals that are answering different questions, and the competition looks hollow on the record.
"This AOI addresses customer wait-time reduction at the base processing center, as described below."
One problem, scoped. If the other topics are also real, they get their own AOIs under the same umbrella. That is exactly what the umbrella is for.
"Offerors shall have prior experience providing services at [Installation] or similar installations in the region."
Reads as an incumbency filter. If the incumbency is actually a necessary qualification, it belongs in a documented sole-source justification, not in a supposedly open competition.
"Offerors shall describe prior experience integrating with LDAP directory services in similar operational environments."
Same idea (relevant experience) expressed against a capability, not against an incumbency. Any offeror with the capability can respond.
Weak: (the LDAP integration requirement appears only in a single sentence in an unrelated paragraph near the end.)
A real constraint that shows up buried will produce proposals that miss it, a round of amendments three weeks late, and an inconsistent record.
Stronger: "Real Constraints: (1) Identity verification shall integrate with LDAP. (2) Document chain-of-custody per SOP shall be maintained. (3) Existing building footprint."
Constraints labeled, numbered, and separated from preferences. Vendors read them. Peer reviewers check against them.
Between this tab and the scenario, you have seen a whole CSO happen and the actual documents it produces. If you want to review the mechanics that hold it together, the Execution tab is the reference. The Concepts tab covers the authority and the structure.