How to define reports, meetings, data, formats, due dates, review windows, CDRLs, and who accepts each deliverable after award.
A deliverable is anything the contractor owes the Government in a usable form by a known date, with someone assigned to review it. Reports, briefings, data exports, plans, manuals, training materials, closeout records, and recurring submissions all count.
Deliverables usually fail because the Government asked for "monthly status" and then expected the contractor to guess the format, content, due date, delivery method, review process, and acceptance standard. The contractor isn't trying to be difficult; they're filling in the blanks the contract left open.
You are part of the acquisition team here. The customer side names the deliverables that actually matter, describes them well enough to administer, and reviews them after award. Contracting can structure the requirement on the page; only you can say what's actually useful to your mission.
A deliverable is anything the contract requires the contractor to submit, provide, turn over, brief, update, transfer, or make available to the Government. Some deliverables are physical. Some are data. Some are recurring. Some only happen once at transition or closeout.
- **Reports:** monthly status reports, metrics, inspection results, risk logs, invoices with support, and corrective action plans.
- **Meeting artifacts:** agendas, slides, minutes, decisions, attendance lists, action item trackers, and follow-up packages.
- **Data:** data exports, databases, drawings, manuals, license records, configuration files, logs, and final turnover data.
- **Closeout and transition items:** as-builts, O&M manuals, warranty documents, training materials, transition plans, inventory records, and final reports.
The first customer job is to name the deliverables that actually matter. The second job is to describe them well enough that a stranger could administer the contract without reading your mind.
Every deliverable should survive five simple questions: what is it, when is it due, what does it look like, where does it go, and who accepts it? If you can't answer one, the contractor probably can't either.
On DoD contracts, formal data deliverables often move through a Contract Data Requirements List, usually called a CDRL. The CDRL lives on DD Form 1423 and tells the contractor what data the Government wants, when it's due, where it goes, who reviews it, and what standard or description controls the content.
Think of it this way: the SOW or PWS tasks the work, and the CDRL orders the data that comes out of the work. If the contractor has to perform maintenance, the PWS describes the maintenance. If the Government also needs a monthly maintenance report, the CDRL defines that data deliverable.
DoD RFO Deviation Part 04 moved this into 204.202-70: the DD Form 1423 is always an exhibit rather than an attachment, and exhibits are used to separately identify data deliverables. DoD RFO Deviation Part 15 keeps 215.470: when data are required to be delivered under a DoD contract, include DD Form 1423 in the solicitation. Customer takeaway: a CDRL establishes real deliverable requirements, not background paperwork.
| CDRL concept | What it does | Customer question |
|---|---|---|
| Data item | Identifies the specific data deliverable, often with a title and data item number. | What exact data do we need from the contractor? |
| DID | A Data Item Description defines the content, format, intended use, and special requirements for the data. | Is there an existing DID in ASSIST Quick Search (the DoD's standardization library) that already describes this deliverable? |
| SOW/PWS reference | Connects the data deliverable to the contract task that creates the need for the data. | Which requirement paragraph actually creates this deliverable? |
| Inspection and acceptance | Identifies whether and how the Government reviews or accepts the data. | Who can say the data is complete and usable? |
| Timing and addressees | States first due date, recurring frequency, final delivery, and who receives it. | When do we need it, and who needs a copy? |
| Tailoring | Lets the Government clarify or limit a DID so the contractor only provides what this contract actually needs. | Are we asking for the useful parts, or buying a paperwork avalanche? |
Tailoring is where customers add real value. A DID was written to cover a wide range of acquisitions, so most DIDs include content that one specific contract doesn't need. The CDRL can mark a DID as tailored and list what's required and what's deleted. If you only need three of the seven sections in a status-report DID, say so. The contractor stops producing what nobody reads, and the file gets cleaner.
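Tailoring is just subtraction, and subtraction is easy to make explicit. A toy sketch with hypothetical section names (no real DID uses these exact labels), assuming the CDRL keeps three of seven sections:

```python
# Hypothetical sections from a broad status-report DID.
DID_SECTIONS = ["summary", "open_actions", "schedule_risks", "staffing",
                "cost_detail", "subcontractor_status", "appendices"]

# What this specific contract actually needs, stated on the CDRL as tailoring.
REQUIRED = {"summary", "open_actions", "schedule_risks"}

# The tailored CDRL lists both what's required and what's deleted.
tailored = [s for s in DID_SECTIONS if s in REQUIRED]
deleted = [s for s in DID_SECTIONS if s not in REQUIRED]
print("required:", tailored)
print("deleted:", deleted)
```

Writing the deleted list down matters as much as the required list: it's the record that the Government chose not to buy that content, rather than forgot it.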
Real CDRLs use compact codes. These are the ones a customer reviewer is most likely to encounter when looking at a draft CDRL or a real contract.
| Code family | What it means | Customer-facing version |
|---|---|---|
| Approval code (Block 8) | A = approval required (Government must accept before the contractor proceeds). N = no review required (data is provided without formal review). I = information only. | How much do we actually need to review this? Code A means the COR or technical reviewer has to act; code N or I means the data flows without a review gate. |
| Frequency (Block 12) | Common shorthand: ASREQ (as required), OTIME (one time), MNTHLY, QRTLY, ANNUAL, DAA (days after award), DAC (days after contract), DRAFT/FINAL. | When does this start, how often does it repeat, and is the schedule tied to award, performance milestones, or the calendar? |
| Distribution Statement (Block 9) | Statements A through F (plus X) control who can see the data once delivered. A is most permissive (approved for public release); F is most restricted (further dissemination only as directed). Required on most DoD technical data. | What's the data, who should be able to see it, and what export-control or operational-security limits apply? Talk to your security manager. |
You don't need to memorize the codes; recognizing them on a CDRL keeps you from rubber-stamping a deliverable schedule that doesn't match what the Government actually needs. If a CDRL has every line marked "A" (approval required) and your team has nobody available to review monthly, that's a problem to surface before award.
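The timing shorthand in Block 12 can be turned into concrete dates before award, which is a quick way to sanity-check a draft schedule. A minimal sketch, assuming a hypothetical award date and treating DAA/DAC as calendar days (confirm the actual counting convention with your CO):

```python
from datetime import date, timedelta

AWARD = date(2025, 10, 1)  # hypothetical award date for illustration

def due_date(code: str, award: date = AWARD) -> date:
    """Translate a Block 12-style shorthand like 'DAC 30' into a calendar date.

    Supports DAA (days after award) and DAC (days after contract),
    counted here as calendar days.
    """
    kind, _, days = code.partition(" ")
    if kind in ("DAA", "DAC"):
        return award + timedelta(days=int(days))
    raise ValueError(f"unsupported code: {code}")

print(due_date("DAC 30"))  # first submission 30 days after the assumed award
```

Running each CDRL line through a translation like this surfaces the schedule as real dates your review team can look at, instead of codes that read as harmless until the first one comes due.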
Common DIDs appear by number on real contracts. Each DID is a few pages defining exactly what content the contractor must produce. Customers should pull the DID from ASSIST and read it before agreeing to a CDRL. If the DID asks for content the Government won't use, tailor it down on the CDRL.
Two related but separate questions about CDRL data: who can see it (distribution statement) and what the Government can do with it (data rights). Customers often conflate these. They live in different parts of the contract.
Distribution statement. Marked on the data when delivered. Tells anyone holding a copy who's allowed to see it and under what conditions. Statement A is approved for public release; B through F progressively restrict to U.S. Government agencies, contractors with a need-to-know, DoD only, and so on. The customer is usually the right person to nominate the statement based on what the data is, who needs to access it later, and what export-control or OPSEC limits apply. Coordinate with your security manager and the CO before assuming.
Data rights. Established by the contract clauses and the markings on the data itself. The CDRL orders the data; the clauses set the rights. Under the DoD RFO Deviation, the data-rights clauses customers are most likely to hear about live in Part 27.
If the Government will need to use, share, modify, maintain, re-procure, or compete based on the data later, tell the CO and legal team early. Those rights have to be in the clauses, and the contractor has to mark the data correctly when they deliver. A CDRL by itself doesn't give the Government rights it didn't already negotiate.
On many DoD contracts, CDRL data deliverables are inspected and accepted in Wide Area Workflow (WAWF), the same place we covered in the Acceptance Criteria training. The contractor submits a Receiving Report tied to the CDRL line; the Government inspector and acceptor act on it electronically. If the data is junk, that's the place to reject it — with documented evidence, not a phone call.
If nobody will review the data, use it, or make a decision with it, ask why the Government is paying for it. The contractor has to produce every CDRL line item; the Government has to receive every CDRL line item. Both cost money. If a deliverable doesn't change a decision, it shouldn't be on the list.
"Monthly report" is one of the most dangerous phrases in a requirement package. It sounds harmless, but it leaves almost everything unresolved.
**Weak:** The contractor shall provide a monthly status report.

**Better:** The contractor shall submit a monthly status report by the fifth business day of each month covering the prior month's performance. The report shall use the Government-provided template and include open actions, completed tasks, schedule risks, staffing changes, issues requiring Government decision, and invoice-supporting detail.
A useful report requirement identifies the reporting period, due date, template, required fields, submission method, reviewer, and what happens when required fields are blank or unsupported.
Meetings are sometimes required by contract, but the meeting itself is rarely the deliverable; the useful Government artifacts live around the meeting.
If the contract requires recurring meetings, define what the contractor must submit before and after each one. Otherwise the meeting becomes a ritual instead of a management tool.
Data deliverables are where customers get hurt later. A dashboard looks great during performance, but if the Government can't export the underlying data at the end of the contract, the next team may start from zero.
Customers should identify the data they need after the contract ends, not just the data they want to view during performance.
This affects more than IT. Maintenance records, inspection data, training records, warranty records, licenses, configuration baselines, inventory, and closeout records all face the same problem.
If the Government needs time to review a deliverable, put that time in the requirement. If the contractor needs to revise and resubmit, explain how that works. Silence creates fake emergencies after award.
| Issue | Weak wording | Better question |
|---|---|---|
| Review period | Government will review. | How many business days does the Government have to approve, reject, or request correction? |
| Rejection | Contractor shall fix errors. | What counts as an error serious enough to reject the deliverable? |
| Resubmission | Contractor shall resubmit. | How long does the contractor have to resubmit, and does the Government review clock restart? |
| Late delivery | Reports are due monthly. | What does "late" mean, who gets notified, and what performance record captures it? |
The goal is predictable delivery and review for both sides — so the post-award conversation is about the work, not about whose process broke first.
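The review clock itself is easy to model, which makes it a good way to test whether the wording in the table above is actually answered. A hedged sketch, assuming a 10-business-day Government review and a full clock restart on resubmission; both numbers are hypothetical choices the contract must state, and holidays are ignored here:

```python
from datetime import date, timedelta

def add_business_days(start: date, n: int) -> date:
    """Advance n business days from start (weekends skipped, holidays ignored)."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            n -= 1
    return d

# Hypothetical timeline: delivery, rejection, resubmission with a restarted clock.
delivered = date(2025, 6, 2)                      # a Monday
review_due = add_business_days(delivered, 10)     # Government decision due
resubmitted = date(2025, 6, 20)                   # contractor resubmits after rejection
second_review_due = add_business_days(resubmitted, 10)
print(review_due, second_review_due)
```

Walking one deliverable through this timeline during requirement development exposes the unanswered questions fast: if nobody can say whether the clock restarts, or whether weekends count, the contract hasn't said it either.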
The easiest way to find deliverable chaos before award is to build a simple matrix. It doesn't have to be beautiful. It does have to make the hidden assumptions visible.
| Column | What to put there | Why it matters |
|---|---|---|
| Deliverable | Name of the report, plan, data file, meeting artifact, product, or closeout item. | Prevents "I thought you meant..." after award. |
| Due date | Calendar date, days after award, days before event, monthly schedule, or final closeout trigger. | Lets the CO build enforceable timing. |
| Format | Template, file type, system, data fields, naming convention, or copy count. | Makes the deliverable usable. |
| Submitted to | COR, technical POC, shared inbox, system, or Government repository. | Stops deliverables from disappearing into one person's inbox. |
| Acceptance standard | Required content, review method, quality threshold, or reason for rejection. | Connects deliverables to the acceptance criteria training. |
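One way to make the hidden assumptions mechanically visible: hold the matrix as structured records and flag every blank cell before the package leaves your desk. A sketch with hypothetical rows and column names mirroring the table above, not a prescribed format:

```python
# Each row mirrors the five matrix columns; empty strings are hidden assumptions.
MATRIX = [
    {"deliverable": "Monthly status report", "due": "5th business day, monthly",
     "format": "Government template", "submitted_to": "COR via shared inbox",
     "acceptance": "All required fields complete and supported"},
    {"deliverable": "Final data export", "due": "",   # blank: nobody set a trigger
     "format": "CSV, field list TBD", "submitted_to": "",
     "acceptance": ""},
]

def blanks(matrix):
    """Return (deliverable name, missing column) pairs for every unanswered cell."""
    return [(row["deliverable"], col)
            for row in matrix
            for col, val in row.items() if not val.strip()]

for name, col in blanks(MATRIX):
    print(f"{name}: missing {col}")
```

A spreadsheet with conditional formatting does the same job; the point is that every blank cell is a question the contractor will otherwise answer for you after award.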
You don't need to know whether the final contract will use a CDRL, an attachment, a PWS table, or another structure. Send contracting the facts so they can choose the right tool.
Read the draft requirement and highlight every sentence where the contractor must submit, provide, brief, report, update, or turn over something. Each highlighted item belongs in a deliverable matrix before the package goes to contracting.
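The highlighting pass can be approximated with a simple scan for deliverable verbs. A rough sketch; the verb list is illustrative, the sentence splitter is naive, and a human still has to read each hit:

```python
import re

# Verbs that usually signal a deliverable obligation; extend for your requirement.
DELIVERABLE_VERBS = r"\b(submit|provide|brief|report|update|turn over|deliver)\b"

def flag_deliverables(draft: str) -> list[str]:
    """Return each sentence of a draft requirement containing a deliverable verb."""
    sentences = re.split(r"(?<=[.!?])\s+", draft)
    return [s for s in sentences
            if re.search(DELIVERABLE_VERBS, s, re.IGNORECASE)]

draft = ("The contractor shall maintain the facility. "
         "The contractor shall submit a monthly maintenance report. "
         "The contractor shall turn over all inspection records at closeout.")
for hit in flag_deliverables(draft):
    print(hit)
```

Every sentence the scan flags should map to a row in the deliverable matrix; every flagged sentence without a row is a deliverable the contract creates but nobody has planned to receive.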