Most "how to get your 10DLC campaign approved" guides online list the form fields and stop there. The actual approval process has a layer of unwritten conventions that the form doesn't capture, and getting it right on the first try is the difference between sending in 3 days and sending in 3 weeks.
This is what we've learned from filing dozens of campaigns across our customer base — what carriers actually scrutinize, the patterns that pass, the patterns that bounce, and the small things that make reviewers say yes.
Who's actually reviewing your campaign
Three layers of review:
- The Campaign Registry (TCR). A central database that validates brand info, screens campaigns against use-case rules, and sets baseline content standards. TCR approval is usually quick (under 72 hours) when the submission is clean.
- Individual carriers (T-Mobile, AT&T, Verizon, US Cellular). Each carrier independently reviews your registered campaign against its own content policies. They can approve, reject, or accept-with-reservations. T-Mobile is consistently the strictest.
- Your messaging provider. Some providers do their own pre-screen before submitting to TCR. We do — partly because it catches errors before they cost you a rejection cycle, partly because it's faster to fix locally than to undo a TCR submission.
Each layer can reject independently. Approval at TCR doesn't mean approval at every carrier, and a campaign that delivers fine to AT&T can get blocked at T-Mobile if it crosses a content line that TCR didn't catch.
What they look at
In rough order of weight:
Brand identity match
The brand information you submit must match what's on file with the IRS, the state of incorporation, and the public web. Specifically:
- Legal business name must match exactly. "Acme, Inc." is not the same as "Acme Inc" or "Acme Incorporated." Reviewers check against state corporate registries.
- EIN must match the legal name. Mismatches are an instant rejection.
- Business address must match the address on file with the IRS.
- Website domain must match the email domain you submit and must resolve to a real, active website. A parked domain or a domain that 404s is a flag.
- DUNS number is optional but adds weight if you have one.
The one that catches the most operators: the legal-name comma. State registries list business names with very specific punctuation. Match it character-for-character.
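The punctuation trap is easy to catch mechanically before you submit. A minimal sketch in plain Python — there's no registry API here; you paste in the exact listing from your state's corporate registry yourself:

```python
def first_mismatch(submitted: str, registry: str):
    """Return (index, submitted_char, registry_char) for the first
    character where the two strings differ, or None on an exact match.
    No normalization on purpose: reviewers compare the raw string,
    commas and periods included."""
    for i, (a, b) in enumerate(zip(submitted, registry)):
        if a != b:
            return (i, a, b)
    if len(submitted) != len(registry):
        i = min(len(submitted), len(registry))
        return (i, submitted[i:i + 1], registry[i:i + 1])
    return None
```

Run it on "Acme Inc" vs. "Acme, Inc." and it points straight at the missing comma at index 4 — the exact class of mismatch that bounces brand registrations.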
Opt-in flow description
The form asks how customers opt in. Carriers don't want a description; they want the actual user-facing flow.
What works:
- "Customers visit [URL] and enter their phone number into a form. The form contains the language: 'Reply Y to receive up to 4 messages per month from [Brand] including appointment reminders. Standard message and data rates apply. Reply STOP to opt out.' Submitting the form triggers an immediate confirmation SMS."
- A screenshot of the actual opt-in form (some platforms let you upload one).
- A link to the live opt-in page.
What gets rejected:
- "Customers can opt in to our service."
- "Existing customers receive notifications."
- "Opt-in is collected during signup."
The pattern is concrete > abstract. Reviewers want to be able to picture exactly what the consumer sees.
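If you generate your opt-in form copy programmatically, templating it from the passing example above keeps the disclosures from drifting as the brand or frequency changes. A sketch — `optin_form_language` is a hypothetical helper, not a platform API:

```python
def optin_form_language(brand: str, max_per_month: int, purpose: str) -> str:
    """Assemble user-facing opt-in copy following the passing example
    above. Brand, frequency, and purpose vary; the consent and
    opt-out disclosures stay fixed."""
    return (
        f"Reply Y to receive up to {max_per_month} messages per month "
        f"from {brand} including {purpose}. "
        "Standard message and data rates apply. Reply STOP to opt out."
    )
```

The point of fixing the disclosure text in one place is that nobody on the team can ship an opt-in form that's missing the STOP or rates language.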
Sample messages
Two samples minimum (one inbound from customer, one outbound from you). More is better — show the variety of messages you'll actually send under this campaign.
Required content in every sample:
- Brand name — the recipient must know who's texting them.
- Opt-out language — at least once in your flow, typically in the welcome message: "Reply STOP to opt out."
- Frequency disclosure in the opt-in confirmation — "Reply Y for up to 4 msgs/mo from [Brand]."
- "Msg & data rates may apply" in the help message and at minimum once in the welcome.
What gets rejected:
- Generic templates: "Reminder: your appointment is tomorrow" — no brand, no opt-out, fails on multiple counts.
- Mismatch between samples and use case: marketing-flavored samples submitted under a Customer Care use case (or vice versa).
- Sample messages that preview content belonging to a different campaign — "Reply YES for our weekly newsletter" in a transactional campaign suggests you'll mix marketing in.
- Hypothetical/placeholder samples with [INSERT NAME]-style brackets visible. Replace with realistic content.
A passing pattern looks like this:
Welcome: "[Acme] You've signed up for appointment reminders. Reply Y to confirm. Up to 4 msgs/mo. Msg & data rates may apply. Reply STOP to opt out, HELP for help."
Reminder: "[Acme] Your appointment is tomorrow at 2:00 PM at our Main St. office. Reply C to confirm or R to reschedule. Reply STOP to opt out."
Help: "[Acme] For support contact us at help@acme.com or 555-555-5555. Msg & data rates may apply."
That's three samples that, taken together, demonstrate what you'll send, that you have proper consent flow, and that you handle the basic compliance pieces.
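Those requirements are mechanical enough to lint before you submit. A rough pre-check using substring heuristics — an assumption-laden sketch, not a carrier rule engine:

```python
def lint_flow(brand: str, welcome: str, help_msg: str, *reminders: str) -> list[str]:
    """Flag missing disclosures across a message flow, per the
    requirements above: brand name in every message, opt-out in the
    welcome, rates disclosure in the welcome and help messages."""
    rates = "msg & data rates may apply"
    problems = []
    flow = [("welcome", welcome), ("help", help_msg)]
    flow += [(f"reminder {i}", m) for i, m in enumerate(reminders, 1)]
    for name, msg in flow:
        if brand.lower() not in msg.lower():
            problems.append(f"{name}: brand name missing")
    if "stop" not in welcome.lower():
        problems.append("welcome: no opt-out (STOP) language")
    if rates not in welcome.lower():
        problems.append("welcome: missing 'Msg & data rates may apply'")
    if rates not in help_msg.lower():
        problems.append("help: missing 'Msg & data rates may apply'")
    return problems
```

Run it against the three samples above and it comes back empty; run it against the generic "Reminder: your appointment is tomorrow" as a welcome and it flags brand, opt-out, and rates in one pass.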
Use case match
Pick the use case that most closely matches what you'll actually send. If your samples and use case don't agree, the campaign bounces. The common bad combinations:
- Marketing campaign with samples that look transactional — if you're going to send promotions, the samples should include promotional content. Reviewers don't accept "I picked Marketing but my samples are reminders just for review purposes."
- Customer Care campaign with samples that include CTAs to buy — calls to action on a Customer Care campaign read as marketing.
- Mixed campaign with samples from only one category — if you registered Mixed because you'll send both transactional and marketing, show both.
Match what you said with what you'll send. Reviewers are pattern-matching humans; consistency reads as honest.
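One crude way to catch the mismatch before a reviewer does: scan each sample for marketing flavor and compare against the registered use case. The keyword list below is illustrative — my own, not anything carriers publish:

```python
MARKETING_SIGNALS = ("% off", "sale", "buy now", "limited time", "discount", "shop now")

def looks_promotional(sample: str) -> bool:
    """True if a sample reads as marketing — the kind of CTA that
    bounces a Customer Care or transactional campaign."""
    text = sample.lower()
    return any(signal in text for signal in MARKETING_SIGNALS)
```

If `looks_promotional` fires on a sample in a Customer Care campaign, either the sample or the use case needs to change before you submit.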
Privacy policy (2026 requirement)
As of 2026, your business website must have a privacy policy that's:
- Reachable at a public URL on your business domain.
- Specific about SMS opt-in data handling — phone numbers collected, what messages are sent, retention, opt-out.
- Not a generic templated policy. Reviewers can spot generic Termly/iubenda templates and flag them.
For most businesses this means adding a paragraph to an existing privacy policy specifically about SMS — what data you collect when someone opts in, what you do with it, how to opt out, who can see it. The paragraph should also reference the carrier-side disclosure ("standard message and data rates may apply").
If you don't have a privacy policy, you'll need to publish one before the campaign is approved.
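A minimal smoke test for whether a policy page covers the SMS-specific points. Fetching the page is left out — this checks text you've already pulled down, and the keyword list is my own heuristic, not a published requirement:

```python
def covers_sms(policy_text: str) -> list[str]:
    """Return the SMS-related topics the policy text fails to mention.
    Keyword heuristic only — a human still reads the policy."""
    text = policy_text.lower().replace("-", " ")
    required = ["phone number", "sms", "opt out", "message and data rates"]
    return [term for term in required if term not in text]
```

An empty result means the obvious topics are present; anything returned is a gap to write before submitting.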
The unwritten rules
Things reviewers care about that aren't on any official checklist:
- Don't share numbers across brands. "Number pooling" — using the same long code under multiple brand registrations — is now treated as a fraud signal. One brand per number.
- Avoid public URL shorteners in your sample messages. bit.ly and tinyurl in samples flag you as suspicious. Use a branded short-domain you own, or full URLs.
- Avoid scary content patterns even in samples. All caps, exclamation marks, dollar symbols, and "act now" phrasing all hurt. Even if it's "just a sample" — reviewers can't differentiate.
- Don't submit campaigns from a brand that's been recently rejected on another platform. Brand reputation is cross-platform. If you're on a second messaging provider after a rejection elsewhere, expect more scrutiny on the second submission.
- Match volume estimates to reality. If you say you'll send 10,000 messages per day on a single campaign and your actual sending pattern is 100 a day, that mismatch can come up in review on subsequent campaigns.
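The shortener rule in particular is easy to automate. A substring check is rough — it will also hit domains that merely contain these strings — but it works as a pre-submit tripwire; the domain list is a starting point, not exhaustive:

```python
PUBLIC_SHORTENERS = ("bit.ly", "tinyurl.com", "is.gd")

def has_public_shortener(sample: str) -> bool:
    """Flag public URL-shortener domains in a sample message.
    Branded short-domains you own won't match and shouldn't."""
    text = sample.lower()
    return any(domain in text for domain in PUBLIC_SHORTENERS)
```

A branded short-domain like go.acme.com passes; bit.ly and friends get caught before a reviewer sees them.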
When carriers reject
T-Mobile, AT&T, and Verizon each have their own content policies layered on top of TCR. The most common carrier-level rejections:
- T-Mobile content scrutiny. They're the strictest on financial services, lending, debt collection, and adult-adjacent content. Healthcare campaigns sometimes get pushback if samples include anything that could be read as referring to specific diagnoses or conditions.
- AT&T format checks. They're stricter on sample format compliance — every required disclosure must be in every sample, exactly. Missing "msg & data rates may apply" in even one sample triggers rejection.
- Verizon volume signals. They look at submitted volume vs. sender history. A new brand requesting high throughput with no prior sending pattern gets pushback.
Carrier rejections don't always come with detailed reasons. The fix is usually to revise samples to be cleaner (more disclosures, less marketing flavor on transactional campaigns, more conservative content) and resubmit through your messaging provider.
What clean approval looks like, end to end
- Day 0: You submit clean brand info matching state and IRS records.
- Day 0: Brand approved within hours (or 1–2 days for vetted brands).
- Day 0–1: You submit a campaign with use case, opt-in flow description (link to live form), and 3+ samples that all include required disclosures.
- Day 1–3: Campaign approved by TCR.
- Day 2–4: Carriers complete their independent review. Most pass.
- Day 3–5: First message goes out.
When it doesn't go cleanly, each rejection round adds 2–4 days. Two or three rounds and you're at 2–3 weeks. The cost of getting it right the first time is small attention to detail; the cost of getting it wrong is repeated cycles.
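The timeline arithmetic, made explicit with the numbers above (clean path and per-round delay are ranges; these are the worst-case ends):

```python
def approval_days(clean_days: int, rejection_rounds: int, days_per_round: int) -> int:
    """Back-of-envelope timeline: each rejection round adds a full
    revise-and-resubmit cycle on top of the clean-path duration."""
    return clean_days + rejection_rounds * days_per_round
```

Zero rounds at the worst clean-path end is day 5; three rounds at 4 days each lands on day 17 — right in the 2–3 week range.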
A short checklist before you submit
- Legal name matches state registry exactly (including punctuation)
- EIN matches the legal name on IRS records
- Address matches IRS record
- Website is live, on the same domain as your email
- Privacy policy is published, reachable, mentions SMS specifically
- Opt-in flow is described concretely (or include a screenshot/URL)
- Use case picked matches what you'll actually send
- At least 3 sample messages, each containing brand name, opt-out, and frequency disclosure
- Welcome sample includes "Msg & data rates may apply"
- Help sample includes contact info + "Msg & data rates may apply"
- No URL shorteners in samples
- No bracketed placeholders ([NAME], [DATE]) in samples — use realistic content
That's most of it. Clean submission, fast approval, predictable outcome. The system rewards specificity and punishes vagueness, and once you get used to the rhythm, it's not difficult — just exact.