The problem
If robots.txt disallows Twitterbot, the crawler can’t access your pages and no card preview gets generated. Tweets show a bare URL.
Check your robots.txt
Visit https://yoursite.com/robots.txt and look for rules that block Twitterbot:
```
# This blocks Twitter Cards:
User-agent: Twitterbot
Disallow: /

# So does this (blocks all bots):
User-agent: *
Disallow: /
```
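If you'd rather check programmatically, Python's standard-library robots.txt parser gives a quick first pass. A minimal sketch, assuming your site lives at https://yoursite.com (a placeholder); the stdlib parser's precedence rules aren't identical to Twitter's crawler, so treat this as a rough check rather than proof:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (yoursite.com is a placeholder).
parser = RobotFileParser("https://yoursite.com/robots.txt")
parser.read()

# Ask whether Twitterbot may fetch the page you want a card for.
url = "https://yoursite.com/your-page"
print("Twitterbot allowed:", parser.can_fetch("Twitterbot", url))
print("Default group allows:", parser.can_fetch("*", url))
```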
The fix
Add an explicit allow rule for Twitterbot before any blanket disallow:
```
User-agent: Twitterbot
Allow: /

User-agent: *
Disallow: /admin/
```
To allow only specific paths:
```
User-agent: Twitterbot
Allow: /blog/
Allow: /products/
Disallow: /
```
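You can sanity-check the path-specific variant before deploying it by feeding the proposed rules straight into the same stdlib parser. A sketch using the example file above; again, this only approximates how Twitter's crawler resolves Allow/Disallow:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Twitterbot
Allow: /blog/
Allow: /products/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /blog/ and /products/ should be allowed, everything else blocked.
for path in ("/blog/post-1", "/products/widget", "/admin/"):
    allowed = parser.can_fetch("Twitterbot", "https://yoursite.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
```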
Gotchas
- Order matters: more specific rules override less specific ones, but put `Allow` before `Disallow` for the same user agent to be safe.
- Cached robots.txt: Twitter caches `robots.txt` separately from page content. After updating, you may need to wait for the cache to expire or use the Card Validator to trigger a re-fetch.
- CDN/WAF blocking: your CDN or firewall might block the crawler by IP or user agent even if `robots.txt` allows it. Check server access logs to confirm requests are reaching your origin (see the sketch after this list for an outside-in check).
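One way to spot user-agent-based blocking from the outside is to request the same page twice, once with a Twitterbot user agent and once with a generic one, and compare status codes. A rough sketch, assuming https://yoursite.com/your-page as a placeholder; note that IP-based blocking won't show up this way, since the requests aren't coming from Twitter's crawler IPs:

```python
import urllib.request

URL = "https://yoursite.com/your-page"  # placeholder

def status_for(user_agent):
    # Issue a HEAD request with the given user agent and report the status code.
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent}, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print("Twitterbot UA:", status_for("Twitterbot/1.0"))
print("Generic UA:   ", status_for("Mozilla/5.0"))
# Differing results (e.g. 403 vs. 200) point to UA-based blocking at the CDN/WAF.
```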
Verify the fix
```
curl -A "Twitterbot/1.0" -I https://yoursite.com/your-page
```
You should see a 200 OK. Then use the Card Validator to force a re-crawl.
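A 200 only proves the page is reachable; the card still needs Twitter Card markup in the HTML served to the crawler. As a quick follow-up check (same placeholder URL; this assumes the tags are emitted in the initial HTML rather than injected by client-side JavaScript):

```python
import urllib.request

URL = "https://yoursite.com/your-page"  # placeholder

req = urllib.request.Request(URL, headers={"User-Agent": "Twitterbot/1.0"})
with urllib.request.urlopen(req, timeout=10) as resp:
    status = resp.status
    html = resp.read().decode("utf-8", errors="replace")

# Twitter reads the card type from a <meta name="twitter:card" ...> tag,
# so that markup has to be present in the HTML the crawler receives.
print("Status:", status)
print("twitter:card present:", "twitter:card" in html)
```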