Executive Summary
We performed a comprehensive security analysis of www.fbi.gov across 5 categories. The website received an overall score of 84/100 (grade B+), with 1 critical issue, 6 warnings, and 25 passed checks.
Overall assessment: www.fbi.gov has a reasonable security foundation but there is clear room for improvement. Several issues were identified that could expose the website or its users to unnecessary risk. We recommend addressing the critical issues first, followed by the warnings outlined below.
Top priority fixes:
Strong areas: DNS & Email Security, SSL & HTTPS, Content & CMS
Needs improvement: Security Headers
Needs work: Performance & SEO
Website Health Check
Simple overview for everyone
Is my website safe for visitors?
Yes — your website uses encryption and has security protections in place.
Can my website be found by Google?
Yes — your website is accessible to search engines and loads at a reasonable speed.
Is my email protected against spoofing?
Yes — your domain has email authentication records (SPF/DMARC) that prevent others from sending fake emails on your behalf.
Is my website leaking sensitive data?
No leaks detected — configuration files and sensitive data appear to be properly protected.
Does my website respect visitor privacy?
Yes — a privacy policy and cookie consent appear to be in place.
Trust & WHOIS
See domain age, registrar, expiry date, server location, and reputation checks across security databases.
Malware & Reputation
Check if your site is flagged by malware databases, blacklists, and antivirus vendors worldwide.
Advanced Security Checks
Detect open ports, exposed files, API vulnerabilities, TLS weaknesses, and subdomain takeover risks.
Privacy & GDPR
Analyze cookie consent, privacy policy presence, third-party trackers, and GDPR compliance signals.
Quality & Accessibility
Check accessibility compliance, robots.txt, branding, broken links, and carbon footprint.
Unlock the full security report
This Quick Scan covers 5 categories. Upgrade to Pro for OWASP Top 10 analysis, malware detection, exposed files, and 15 more scanners.
Full report
DNS & Email Security
83/100
SPF record configured
SPF record found: "v=spf1 +mx ip4:153.31.0.0/16 -all".
DMARC record configured
DMARC record found with policy "reject": "v=DMARC1; p=reject; rua=mailto:dmarc-feedback@fbi.gov,mailto:reports@dmarc.cyber.dhs.gov; ruf=mailto:dmarc-feedback@fbi.gov; pct=100".
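The two records above can be read mechanically. The following is a minimal sketch (not a full RFC 7208 / RFC 7489 implementation) that splits the published SPF and DMARC strings into their components; the record strings are copied from the scan findings above.

```python
# Illustrative sketch: parse the SPF and DMARC records reported above.
# Not a complete validator -- it only splits the records into parts.

def parse_spf(record: str) -> dict:
    """Split an SPF record into its mechanisms and final qualifier."""
    parts = record.split()
    assert parts[0] == "v=spf1", "not an SPF record"
    # The final term's qualifier decides how non-matching senders are treated:
    # -all = hard fail, ~all = soft fail, ?all = neutral, +all = pass (unsafe).
    return {"mechanisms": parts[1:-1], "all_qualifier": parts[-1]}

def parse_dmarc(record: str) -> dict:
    """Split a DMARC record into tag=value pairs."""
    return dict(t.strip().split("=", 1) for t in record.split(";") if "=" in t)

spf = parse_spf("v=spf1 +mx ip4:153.31.0.0/16 -all")
dmarc = parse_dmarc(
    "v=DMARC1; p=reject; rua=mailto:dmarc-feedback@fbi.gov,"
    "mailto:reports@dmarc.cyber.dhs.gov; ruf=mailto:dmarc-feedback@fbi.gov; pct=100"
)

print(spf["all_qualifier"])  # -all: mail from unlisted hosts is rejected
print(dmarc["p"])            # reject: mail failing DMARC is refused outright
```

The "-all" qualifier and "p=reject" policy are the strictest settings, which is why both checks pass.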
CAA record configured
CAA record found — only authorized Certificate Authorities can issue SSL certificates for this domain.
DKIM record configured
No DKIM record found for common selectors. DKIM cryptographically signs outgoing emails, making them verifiable and preventing tampering in transit.
Fix: Configure DKIM in your email provider (Google Workspace, Microsoft 365, etc.) and publish the TXT record they provide at {selector}._domainkey.fbi.gov
MTA-STS (email transport security)
No MTA-STS record found at _mta-sts.fbi.gov. Without it, email delivery to your domain could silently fall back to unencrypted connections.
Fix: Implement MTA-STS: add a TXT record at _mta-sts.fbi.gov with value "v=STSv1; id=YYYYMMDD01" and publish a policy file at https://mta-sts.fbi.gov/.well-known/mta-sts.txt
IPv6 support
Domain has an AAAA record — IPv6 is supported.
BIMI record
No BIMI record found. BIMI lets your brand logo appear in email clients that support it — a trust and branding signal for recipients.
Fix: BIMI requires DMARC with p=quarantine or p=reject. Then add a TXT record at default._bimi.fbi.gov: v=BIMI1; l=https://yourdomain.com/logo.svg
DNSSEC
DNSSEC could not be verified via this automated check (PHP DNS resolvers strip DNSSEC data). Check with your domain registrar or use dnsviz.net to verify.
SSL & HTTPS
100/100
HTTPS / SSL enabled
The website is accessible over HTTPS.
SSL certificate valid
Certificate is valid and expires on 2026-06-11 (46 days left).
HTTP redirects to HTTPS
HTTP traffic is permanently (301) redirected to HTTPS.
HSTS header configured
Strict-Transport-Security header found with max-age=31536000. includeSubDomains is set.
No weak cipher suites
Server does not accept known weak cipher suites (RC4, 3DES, EXPORT, NULL).
TLS 1.0 and 1.1 disabled
Server only accepts TLS 1.2 or higher. Deprecated TLS versions are not supported.
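The same floor can be enforced client-side. A minimal sketch using Python's standard `ssl` module, mirroring the server policy verified above:

```python
import ssl

# Sketch: a client-side SSL context that refuses TLS < 1.2, matching the
# server policy verified above. Deprecated versions (TLS 1.0/1.1) are
# excluded at the context level before any handshake happens.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# A handshake through this context (e.g. http.client.HTTPSConnection with
# context=context) would fail against a server offering only TLS 1.0/1.1.
print(context.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```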
Content & CMS
100/100
No mixed content detected
No insecure HTTP resources (scripts, images, stylesheets) found in the page HTML.
CMS admin panel not publicly accessible
No publicly accessible CMS admin interface found at common paths.
CMS version not exposed
No CMS version information found in the page source.
Subresource Integrity (SRI)
No external scripts or stylesheets lacking Subresource Integrity hashes were detected.
No open redirect
No open redirect detected via common redirect parameters.
Directory listing disabled
Directory listing is not enabled — files cannot be browsed directly.
Security Headers
75/100
Server version not disclosed
The Server header does not expose version information.
Content-Security-Policy
CSP is set but weakened by 'unsafe-eval' in script-src. This directive permits eval()-style dynamic code execution, undermining the policy's protection against XSS injection.
Fix: Remove 'unsafe-inline' and 'unsafe-eval' from your CSP. Replace inline scripts with external files or use nonces/hashes. Test your policy at https://csp-evaluator.withgoogle.com/
X-Frame-Options
X-Frame-Options: SAMEORIGIN — protects against clickjacking.
X-Content-Type-Options
X-Content-Type-Options: nosniff is set — prevents MIME-type sniffing.
Referrer-Policy
Referrer-Policy: same-origin — referrer information is sent only on same-origin requests.
Permissions-Policy
Permissions-Policy header found — browser feature access is restricted.
Cookie security flags
One or more cookies are missing security flags: __cf_bm (missing: SameSite).
Fix: Set HttpOnly (prevents JS access), Secure (HTTPS only), and SameSite=Lax or Strict on all cookies.
Cross-Origin-Opener-Policy
COOP: same-origin — protects against cross-origin window attacks and Spectre-based data leaks.
Cross-Origin-Embedder-Policy
COEP: require-corp — ensures all embedded resources opt-in to being loaded cross-origin.
Performance & SEO
50/100
Fast server response time (TTFB)
Time To First Byte: 13 ms (measured from our scanner server) — excellent.
Response compression enabled
Compression is enabled (br) — reduces transfer size and speeds up page loads.
robots.txt present
No robots.txt file found.
Fix: Create a robots.txt file to guide search engine crawlers and prevent indexing of sensitive paths.
XML sitemap present
No sitemap.xml found at common locations (/sitemap.xml, /sitemap_index.xml).
Fix: Create and submit an XML sitemap to Google Search Console to improve search indexing.
security.txt present
No security.txt file found at /.well-known/security.txt or /security.txt.
Fix: Create a security.txt file (RFC 9116) at /.well-known/security.txt to provide security researchers with a responsible disclosure contact.
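A minimal security.txt needs only the two fields RFC 9116 requires, Contact and Expires; the address and date below are placeholders to replace with your own:

```
Contact: mailto:security@yourdomain.com
Expires: 2026-12-31T23:59:59Z
Preferred-Languages: en
```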
Critical issues (1)
Cookie security flags
What is this?
HTTP cookies can carry security flags: HttpOnly (prevents JavaScript from reading the cookie, blocking XSS-based session theft), Secure (transmits the cookie only over HTTPS, never plain HTTP), and SameSite (controls cross-site submission, blocking CSRF attacks).
Why does it matter?
Without HttpOnly, malicious scripts injected via XSS can steal session cookies. Without Secure, cookies can leak over HTTP redirects or mixed-content requests. Without SameSite, cookies are sent with cross-site requests, enabling CSRF attacks that make users perform actions without their knowledge.
How to fix it
Add all three flags when setting cookies:

Set-Cookie: session=abc123; HttpOnly; Secure; SameSite=Lax

PHP:

session_set_cookie_params([
    'httponly' => true,
    'secure' => true,
    'samesite' => 'Lax',
]);

Laravel, in config/session.php:

'http_only' => true,
'secure' => true,
'same_site' => 'lax',

Use SameSite=Lax for most sites. Use SameSite=Strict if cross-site links to your site don't need to carry the session.
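A check in the same spirit as the scanner's can be sketched in a few lines: inspect a Set-Cookie header and report which of the three flags are absent. The header value below mirrors the finding above (__cf_bm lacking SameSite); the parsing logic is illustrative, not the scanner's actual code.

```python
# Sketch: report which security flags are missing from a Set-Cookie header.
REQUIRED = ("HttpOnly", "Secure", "SameSite")

def missing_flags(set_cookie: str) -> list[str]:
    # Attributes follow the name=value pair; compare case-insensitively,
    # and ignore attribute values (e.g. SameSite=Lax -> "samesite").
    attrs = {a.strip().lower().split("=", 1)[0]
             for a in set_cookie.split(";")[1:]}
    return [f for f in REQUIRED if f.lower() not in attrs]

header = "__cf_bm=abc123; Path=/; HttpOnly; Secure"
print(missing_flags(header))  # ['SameSite']
```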
Warnings (6)
Missing DKIM record
What is this?
DKIM (DomainKeys Identified Mail) adds a cryptographic signature to every outgoing email. The signature is created with a private key on your mail server and verified by recipients using a public key published in DNS.
Why does it matter?
DKIM proves that an email actually came from your mail server and was not modified in transit. Without DKIM, anyone can send emails that appear to be from your domain (spoofing), and DMARC alignment checks will fail even if SPF passes.
How to fix it
DKIM is configured in your email provider, not directly in DNS. The process:

1. Generate a DKIM key pair in your email provider:
   - Google Workspace: Admin console → Apps → Gmail → Authenticate email
   - Microsoft 365: Admin center → Settings → Domains → DKIM
   - Mailchimp/SendGrid/Mailjet: each has a DKIM setup page in its dashboard
2. Copy the TXT record they provide and add it to your DNS:
   Name: selector._domainkey.yourdomain.com
   Value: v=DKIM1; k=rsa; p=MIGf...
3. Activate DKIM signing in your provider after publishing the DNS record.

The selector name (e.g. 'google', 'selector1') comes from your email provider.
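The TXT record in step 2 is a simple tag=value string. A minimal sketch of how a verifier reads it (the key below is a fabricated placeholder, not a real DKIM key):

```python
import base64

# Sketch: parse a DKIM TXT record of the kind published at
# {selector}._domainkey.yourdomain.com. Placeholder key, for illustration.

def parse_dkim(record: str) -> dict:
    tags = dict(t.strip().split("=", 1) for t in record.split(";") if "=" in t)
    # The p= tag holds the base64-encoded public key that recipients use
    # to verify signatures; decoding raises if it is not valid base64.
    if "p" in tags:
        base64.b64decode(tags["p"])
    return tags

record = "v=DKIM1; k=rsa; p=" + base64.b64encode(b"example-public-key").decode()
tags = parse_dkim(record)
print(tags["k"])  # rsa
```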
Missing MTA-STS policy
What is this?
MTA-STS (Mail Transfer Agent Strict Transport Security) is a standard that forces other mail servers to use encrypted TLS connections when delivering email to your domain. Without it, a network attacker could silently strip TLS from email in transit.
Why does it matter?
Email is delivered between servers using SMTP. By default, SMTP tries TLS but falls back to plaintext if TLS is not available — a downgrade attack. MTA-STS prevents this fallback, ensuring all email delivered to your domain is encrypted in transit.
How to fix it
Implementing MTA-STS requires two things:

1. A DNS TXT record at _mta-sts.yourdomain.com:
   v=STSv1; id=20240101001
2. A policy file hosted at https://mta-sts.yourdomain.com/.well-known/mta-sts.txt with content:
   version: STSv1
   mode: enforce
   mx: mail.yourdomain.com
   max_age: 86400

Start with mode: testing to see reports before enforcing. Use mta-sts.io for a guided setup.
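Before publishing, the policy file's structure can be sanity-checked. A minimal sketch (the policy text mirrors the example above; the checks cover only the basics, not every RFC 8461 rule):

```python
# Sketch: validate the basic structure of an MTA-STS policy file.

POLICY = """\
version: STSv1
mode: enforce
mx: mail.yourdomain.com
max_age: 86400
"""

def parse_policy(text: str) -> dict:
    fields: dict[str, list[str]] = {}
    for line in text.splitlines():
        key, _, value = line.partition(":")
        fields.setdefault(key.strip(), []).append(value.strip())
    assert fields.get("version") == ["STSv1"], "version must be STSv1"
    assert fields.get("mode", [""])[0] in ("testing", "enforce", "none")
    assert fields.get("mx"), "at least one mx line is required"
    return fields

policy = parse_policy(POLICY)
print(policy["mode"][0])  # enforce
```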
Weakened Content-Security-Policy
What is this?
Content Security Policy (CSP) is a browser security feature that lets you control which resources (scripts, styles, images, fonts) a page is allowed to load, and from which origins.
Why does it matter?
CSP is one of the most effective defences against Cross-Site Scripting (XSS) attacks. Without CSP, an attacker who injects malicious JavaScript into your page can load resources from anywhere, steal session cookies, or redirect users.
How to fix it
Add a Content-Security-Policy header. Start with a report-only policy to detect issues without breaking anything:

Content-Security-Policy-Report-Only: default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline';

Once tested, switch to enforcing:

Content-Security-Policy: default-src 'self'; ...

CSP policies can be complex for sites with third-party scripts. Use https://csp-evaluator.withgoogle.com/ to evaluate your policy.
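A check similar in spirit to the one that produced the 'unsafe-eval' warning can be sketched by scanning each directive for weakening keywords (illustrative logic, not the scanner's actual implementation):

```python
# Sketch: flag weakening source keywords per CSP directive.
UNSAFE = ("'unsafe-inline'", "'unsafe-eval'")

def weak_directives(csp: str) -> dict[str, list[str]]:
    findings = {}
    for directive in csp.split(";"):
        tokens = directive.split()
        if not tokens:
            continue
        name, sources = tokens[0], tokens[1:]
        bad = [s for s in sources if s in UNSAFE]
        if bad:
            findings[name] = bad
    return findings

csp = "default-src 'self'; script-src 'self' 'unsafe-eval'; style-src 'self'"
print(weak_directives(csp))  # {'script-src': ["'unsafe-eval'"]}
```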
Missing robots.txt
What is this?
robots.txt is a plain text file at the root of your website that tells search engine crawlers which pages they are and aren't allowed to index.
Why does it matter?
Without a robots.txt, crawlers may index admin panels, staging areas, duplicate content, or other pages that should not appear in search results. A well-configured robots.txt also prevents crawl budget waste on unimportant pages.
How to fix it
Create a file at https://yourdomain.com/robots.txt with at minimum:

User-agent: *
Disallow:
Sitemap: https://yourdomain.com/sitemap.xml

To block specific paths:

User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /

WordPress: generated automatically — check Settings > Reading. Laravel: create public/robots.txt manually.
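A candidate robots.txt can be verified before deployment with the standard-library parser. The rules below use the hypothetical /admin/ example from the fix above:

```python
from urllib import robotparser

# Sketch: check that a draft robots.txt allows and blocks what you intend.
ROBOTS = """\
User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

print(rp.can_fetch("*", "https://yourdomain.com/"))              # True
print(rp.can_fetch("*", "https://yourdomain.com/admin/users"))   # False
```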
Missing XML sitemap
What is this?
An XML sitemap is a file that lists all the important URLs on your website, helping search engines discover and index your pages more efficiently.
Why does it matter?
Search engines may miss pages that are not linked from anywhere (orphan pages) or pages deep in your site structure. A sitemap ensures they are found and indexed. It also allows you to signal content priority and update frequency.
How to fix it
Create an XML sitemap at https://yourdomain.com/sitemap.xml

WordPress: install Yoast SEO or use the built-in sitemap at /wp-sitemap.xml
Laravel: use the spatie/laravel-sitemap package
Static sites: generate with a sitemap generator tool

After creating your sitemap, submit it to:
- Google Search Console: search.google.com/search-console
- Bing Webmaster Tools: bing.com/webmasters

Also reference it in your robots.txt:
Sitemap: https://yourdomain.com/sitemap.xml
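For small sites, a minimal sitemap can even be generated with the standard library alone. A sketch with placeholder URLs (real sitemaps list your site's actual pages):

```python
import xml.etree.ElementTree as ET

# Sketch: build a minimal sitemap.xml in the sitemaps.org 0.9 namespace.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls: list[str]) -> bytes:
    ET.register_namespace("", NS)  # serialize without a namespace prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = url
    return ET.tostring(urlset, encoding="utf-8", xml_declaration=True)

sitemap = build_sitemap(["https://yourdomain.com/", "https://yourdomain.com/contact"])
print(sitemap.decode())
```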