
FEC Compliance for Political Social Media: The Disclosure Rules Every Campaign Must Follow in 2026

Tags: FEC compliance, political advertising, social media, campaign law, disclosure

[IMAGE: A visual maze constructed from legal documents, social media platform logos, and regulatory agency seals (FEC, FCC). A clear path is highlighted through the maze in gold, representing the compliance route. Professional, clean design.]

Thesis: The gap between what the law requires and what platforms enforce creates a compliance maze that most campaigns navigate by guesswork — and guesswork is how enforcement cases start.

Political social media compliance in 2026 is not one set of rules. It's a layered system of federal regulations, state laws, platform-specific policies, and emerging AI disclosure requirements that often conflict with each other. The FEC sets the federal floor, 28 states have added their own deepfake laws, each platform enforces its own ad policies, and the AI regulation landscape changes monthly.

Most campaigns either over-comply (wasting resources on unnecessary restrictions) or under-comply (creating legal exposure they don't know about). This guide maps the actual landscape.


Federal Baseline: FEC Disclaimer Rules

The Core Requirement (Effective March 1, 2023)

Every political ad — including digital ads — must include a disclaimer that is "clear and conspicuous." The disclaimer must state:

  • Who paid for the communication (e.g., "Paid for by Smith for Congress")
  • Whether it was authorized by a candidate (e.g., "Authorized by Smith for Congress" or "Not authorized by any candidate or candidate's committee")

These requirements apply to any public communication that expressly advocates for or against a clearly identified federal candidate, or that solicits contributions.

The 25% Rule for Small Digital Ads

For digital ads where a full disclaimer cannot reasonably fit (think: small display ads, short social media formats), the FEC adopted an adapted disclaimer framework:

| Component | Requirement |
| --- | --- |
| Adapted disclaimer | Abbreviated version that fits the ad format |
| Indicator | Clear visual or text cue that more information is available |
| One-action mechanism | The viewer must be able to access the full disclaimer in one click/tap/action |

This means a small Instagram story ad doesn't need the full "Paid for by..." text on screen, but it does need a visible indicator (like a "Paid for by..." link) that takes the viewer to the complete disclaimer in one action.
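The full-versus-adapted decision above can be sketched as a simple check. This is a minimal illustration, not legal logic: the `Ad` fields (`format_fits_full_disclaimer`, `has_indicator`, `one_action_to_full`) are hypothetical names invented here to model the rule.

```python
from dataclasses import dataclass

@dataclass
class Ad:
    """Hypothetical ad model for illustrating the disclaimer decision."""
    format_fits_full_disclaimer: bool  # can the full "Paid for by..." text fit on screen?
    has_indicator: bool                # visible cue that more information is available
    one_action_to_full: bool           # full disclaimer reachable in one click/tap/action

def disclaimer_status(ad: Ad) -> str:
    # If the full disclaimer fits the format, include it directly.
    if ad.format_fits_full_disclaimer:
        return "include full disclaimer"
    # Small format: an adapted disclaimer needs both a visible indicator
    # and a one-action path to the complete disclaimer.
    if ad.has_indicator and ad.one_action_to_full:
        return "adapted disclaimer OK"
    return "non-compliant: add indicator + one-action mechanism"
```

For example, a small story-format ad with a "Paid for by..." link that opens the complete disclaimer in one tap passes; the same ad without the link does not.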

FEC Enforcement Reality

As of July 2024, the FEC had 101 active enforcement cases. These aren't theoretical — they involve real campaigns, real fines, and real legal costs. The FEC may be a notoriously slow-moving agency, but cases do resolve, and the penalties can be significant.

[IMAGE: Infographic showing a political social media ad with callouts pointing to required disclaimer elements — the "Paid for by" text, the authorization statement, and for small formats, the indicator and one-action mechanism. Side panel showing "101 active FEC enforcement cases as of July 2024."]


The AI Disclosure Patchwork

Federal Level: FCC vs. FEC

The regulatory picture at the federal level is split:

  • FCC (Federal Communications Commission): Has proposed AI disclosure rules, but they apply only to radio and TV broadcasts. Social media is not covered by the FCC's proposed rules.
  • FEC (Federal Election Commission): Has not yet adopted specific AI disclosure requirements for political ads.

This means there is currently no federal requirement to disclose AI-generated content in political social media ads. The gap is real and significant.

State Level: 28 Laws and Counting

While the federal government deliberates, states have moved aggressively:

  • 28 states have enacted some form of deepfake disclosure law applicable to political contexts
  • 146 bills related to AI in elections were introduced across state legislatures in 2025 alone

The state laws vary dramatically in scope, enforcement mechanisms, and penalties. Some require disclosure of any AI-altered content. Others focus specifically on deepfakes of candidates. Some apply only during election periods. Others are year-round.

Key State Cases

California AB 2839: This was one of the most ambitious state attempts — it would have required disclosure of AI-generated political content with significant penalties. It was struck down on First Amendment grounds. The court found that the law's restrictions on political speech were overly broad.

TAKE IT DOWN Act (May 2025): Federal legislation that specifically addresses intimate deepfakes — non-consensual intimate imagery generated by AI. Importantly, this law covers only intimate content, not political deepfakes more broadly.

Trump Executive Order: The administration has challenged state AI laws, creating additional uncertainty about which state-level requirements will survive legal challenges.

The Compliance Matrix

| Jurisdiction | AI Disclosure Required? | Scope | Enforcement |
| --- | --- | --- | --- |
| Federal (FEC) | No specific AI rule yet | N/A | N/A |
| Federal (FCC) | Proposed for radio/TV only | Broadcast only | Not yet effective |
| 28 states | Yes, various | Varies by state | Varies; some active, some toothless |
| California | AB 2839 struck down | N/A | N/A |
| Federal (TAKE IT DOWN) | Yes, intimate deepfakes only | Intimate imagery only | Criminal penalties |

Platform-by-Platform: Where the Real Rules Live

In practice, platform policies are often more immediately consequential than government regulation. Here's what each major platform requires:

Meta (Facebook & Instagram)

| Requirement | Detail |
| --- | --- |
| Paid political ads | Allowed in the US |
| Ad Library | All political ads archived for 7 years |
| AI disclosure | Required; advertisers must disclose AI-generated or AI-altered content |
| Pre-election blackout | 7-day restriction on new political ads before elections |
| Verification | Advertiser identity verification required |

Meta's Ad Library is the most comprehensive transparency tool in the political advertising space. Every political ad, including targeting parameters and spend ranges, is publicly accessible for 7 years.

Google / YouTube

| Requirement | Detail |
| --- | --- |
| Paid political ads | Allowed with verification |
| Verification | Identity verification required for election advertisers |
| AI/synthetic content | Disclosure required since 2025 |
| Sensitive topic labels | Auto-applied to content about elections and candidates |
| Auto-disclosure | Platform may add labels even if the advertiser doesn't |

X (formerly Twitter)

| Requirement | Detail |
| --- | --- |
| Paid political ads | Allowed in 38 countries |
| Pre-approval | Required before political ads can run |
| Disclaimer format | "Paid For By" must be machine-readable |
| Expanded definition (2026) | Now includes AI-related, crypto, and data privacy political ads |
| Policy updates | 14 major policy updates in 2026 alone |

X's expanded definition of political advertising in 2026 is significant: ads about AI policy, cryptocurrency regulation, and data privacy are now classified as political ads requiring disclosure. This catches many issue advertisers who didn't previously consider their content "political."

TikTok

| Requirement | Detail |
| --- | --- |
| Paid political ads | Complete global ban |
| Enforcement | Problematic; an NBC investigation found 52 videos violating the ban with "Paid Partnership" tags |
| Organic political content | Allowed |

TikTok's complete ban on paid political advertising is the most restrictive policy among major platforms. However, NBC's finding of 52 violating videos demonstrates that enforcement has significant gaps.

[IMAGE: Four-quadrant comparison chart showing Meta, Google/YouTube, X, and TikTok political ad policies side by side. Each quadrant has a green/yellow/red indicator for: ads allowed, AI disclosure required, pre-election restrictions, and enforcement strength.]


The Two Gaps Nobody Talks About

The Influencer Gap

There is currently no FEC requirement for paid political endorsement disclosure by influencers. A campaign can pay an influencer to promote a candidate, and while the FTC has general endorsement guidelines, the FEC has not established specific rules for political influencer partnerships.

This gap means:

  • Campaigns can route political messaging through influencers without standard disclaimer requirements
  • The line between organic support and paid promotion is invisible to voters
  • Enforcement is essentially nonexistent in this space

The Streaming Gap

An estimated $2.3 billion in political advertising flows through streaming platforms, yet the FEC's rules don't specifically mention streaming. This creates a regulatory gray zone where:

  • Connected TV and streaming ads may not technically fall under the same requirements as broadcast TV
  • Political ad transparency on streaming platforms varies wildly
  • Some streaming services apply broadcast-style rules voluntarily; others don't

These two gaps represent the largest areas of regulatory uncertainty in political digital advertising.


Practical Compliance Framework

For Every Piece of Political Content, Ask:

  1. Is this a "public communication" under FEC rules? If it expressly advocates for/against a federal candidate or solicits contributions, it needs a disclaimer.
  2. Does the format accommodate a full disclaimer? If yes, include it. If no, use the adapted disclaimer + indicator + one-action mechanism.
  3. Was AI used in creation? Check the platform's AI disclosure policy and applicable state laws.
  4. Which states will see this ad? Check those states' specific disclosure requirements.
  5. Is this going through an influencer? Document the arrangement even though FEC rules don't yet require disclosure — this is where regulation is heading.

Compliance Checklist by Platform

| Step | Meta | Google/YT | X | TikTok |
| --- | --- | --- | --- | --- |
| Verify advertiser identity | Required | Required | Required | N/A (no paid ads) |
| Include FEC disclaimer | Yes | Yes | Yes | N/A |
| Disclose AI content | Required | Required | Check expanded definition | N/A |
| Observe pre-election restrictions | 7-day blackout | None specified | None specified | N/A |
| Archive documentation | Platform archives 7 years | Platform archives | Platform archives | N/A |
| Review state-specific rules | Yes | Yes | Yes | Yes (for organic) |

What to Watch in 2026

The compliance landscape is moving fast. Key developments to monitor:

  • FEC AI rulemaking: The commission is under pressure to establish AI-specific disclosure rules. Any rule could take effect with relatively short notice.
  • State law challenges: The California AB 2839 decision and the Trump executive order on state AI laws will shape what survives judicial review.
  • Platform policy changes: X's 14 major policy updates in 2026 demonstrate how quickly platforms can change the rules. What's compliant today may not be tomorrow.
  • Streaming regulation: With $2.3 billion in political streaming ads, regulatory attention is inevitable.
  • Influencer enforcement: The FTC is increasingly active on endorsement disclosure, and political influencer partnerships are a logical next target.

Sources

  • Federal Election Commission, disclaimer rules for public communications (effective March 1, 2023)
  • FEC adapted disclaimer framework — 25% rule for small digital ads
  • FEC enforcement data — 101 active cases as of July 2024
  • FCC proposed AI disclosure rules — radio/TV only, does not cover social media
  • National Conference of State Legislatures — 28 states with deepfake disclosure laws, 146 AI bills introduced in 2025
  • California AB 2839 — struck down on First Amendment grounds
  • TAKE IT DOWN Act (May 2025) — intimate deepfakes federal legislation
  • Trump executive order challenging state AI laws
  • Meta Ad Library and political advertising policies — 7-year archive, AI disclosure, 7-day blackout
  • Google/YouTube election advertising verification and AI/synthetic content rules (2025)
  • X (Twitter) political advertising policies — 38 countries, pre-approval, expanded definition 2026, 14 major policy updates in 2026
  • TikTok political ad ban — NBC investigation finding 52 violating videos
  • Streaming political advertising estimates — approximately $2.3 billion
  • FEC influencer disclosure gap analysis

Compliance isn't a one-time checklist — it's an ongoing process that changes every time a platform updates its policies or a state passes a new law. If your campaign is spending real money on digital political communication, you need a team that tracks this landscape continuously. Let's make sure you're covered →