Security report for

www.shadowbit.cc

Scanned 1 hour ago (cached result)
85/100
A-
Overall grade
Better than 92% of scanned websites

Executive Summary


We performed a comprehensive security analysis of www.shadowbit.cc across 5 categories. The website received an overall score of 85/100 (grade A-), with 0 critical issues, 7 warnings, and 23 passed checks.

Overall assessment: www.shadowbit.cc demonstrates a strong security posture. The website follows most security best practices and is well-configured. Minor improvements are possible but no urgent issues were found. Continue monitoring regularly to maintain this level of security.

Strong areas

DNS & Email Security

SSL & HTTPS

Content & CMS

Security Headers

Needs work

Performance & SEO

Website Health Check

Simple overview for everyone

Is my website safe for visitors?

Yes — your website uses encryption and has security protections in place.

Good

Can my website be found by Google?

Yes — your website is accessible to search engines and loads at a reasonable speed.

Good

Is my email protected against spoofing?

Yes — your domain has email authentication records (SPF/DMARC) that prevent others from sending fake emails on your behalf.

Good

Is my website leaking sensitive data?

No leaks detected — configuration files and sensitive data appear to be properly protected.

Good

Does my website respect visitor privacy?

Yes — a privacy policy and cookie consent appear to be in place.

Good

Fixed since the previous scan

DKIM record configured
Permissions-Policy header configured

Trust & WHOIS

See domain age, registrar, expiry date, server location, and reputation checks across security databases.

Domain Age · WHOIS Data · Server Location · Reputation Check · Expiry Alert

Malware & Reputation

Check if your site is flagged by malware databases, blacklists, and antivirus vendors worldwide.

VirusTotal · URLhaus · Spamhaus · PhishTank · Cloudflare DNS

Advanced Security Checks

Detect open ports, exposed files, API vulnerabilities, TLS weaknesses, and subdomain takeover risks.

Open Ports · Exposed Files · API Security · TLS Ciphers · Subdomain Takeover

Privacy & GDPR

Analyze cookie consent, privacy policy presence, third-party trackers, and GDPR compliance signals.

Cookie Consent · Privacy Policy · Tracker Detection · GDPR Compliance

Quality & Accessibility

Check accessibility compliance, robots.txt, branding, broken links, and carbon footprint.

Accessibility · Robots & SEO · Branding · Broken Links · Carbon Footprint

Unlock the full security report

This Quick Scan covers 5 categories. Upgrade to Pro for OWASP Top 10 analysis, malware detection, exposed-file checks, and 15 more scanners.

Full report

DNS & Email Security

100/100

SPF record configured

SPF record found: "v=spf1 -all".

DMARC record configured

DMARC record found with policy "reject": "v=DMARC1; p=reject; adkim=s; aspf=s;".

CAA record configured

CAA record found — only authorized Certificate Authorities can issue SSL certificates for this domain.

DKIM record configured

DKIM record found (selector "google") — outgoing emails are cryptographically signed.
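
To confirm the DKIM record yourself, query the selector directly; a quick check from a shell, assuming standard dig and the "google" selector reported above:

  # The answer should start with v=DKIM1 and contain the public key (p=...)
  dig TXT google._domainkey.shadowbit.cc +short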

MTA-STS (email transport security)

No MTA-STS record found at _mta-sts.shadowbit.cc. Without it, email delivery to your domain could silently fall back to unencrypted connections.

Fix: Implement MTA-STS by adding a TXT record at _mta-sts.shadowbit.cc with value "v=STSv1; id=YYYYMMDD01" and publishing a policy file at https://mta-sts.shadowbit.cc/.well-known/mta-sts.txt (detailed steps in the Warnings section below).

IPv6 support

Domain has an AAAA record — IPv6 is supported.

BIMI record

No BIMI record found. BIMI lets your brand logo appear in email clients that support it — a trust and branding signal for recipients.

Fix: BIMI requires DMARC with p=quarantine or p=reject (already satisfied here). Then add a TXT record at default._bimi.shadowbit.cc with value v=BIMI1; l=https://yourdomain.com/logo.svg, where l points to your SVG logo.

DNSSEC

DNSSEC could not be verified via this automated check (PHP DNS resolvers strip DNSSEC data). Check with your domain registrar or use dnsviz.net to verify.
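
As a manual check, dig can reveal whether the zone publishes DNSSEC records; a sketch assuming a standard Unix dig:

  # A signed zone returns RRSIG records alongside the answer
  dig +dnssec A shadowbit.cc

  # A DS record at the parent means the delegation itself is signed
  dig +short DS shadowbit.cc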

SSL & HTTPS

85/100

HTTPS / SSL enabled

The website is accessible over HTTPS.

SSL certificate valid

Certificate is valid and expires on 2026-07-06 (63 days left).

HTTP redirects to HTTPS

HTTP redirects to HTTPS, but not via a fully permanent redirect chain.

Fix: Use 301 permanent redirects at every step from HTTP to HTTPS for better SEO and caching.

HSTS header configured

HSTS header present but max-age is only 15552000 seconds (minimum recommended: 31536000).

Fix: Set Strict-Transport-Security: max-age=31536000; includeSubDomains

No weak cipher suites

Server does not accept known weak cipher suites (RC4, 3DES, EXPORT, NULL).

TLS 1.0 and 1.1 disabled

Server only accepts TLS 1.2 or higher. Deprecated TLS versions are not supported.
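
You can reproduce this check with openssl; a sketch assuming the openssl CLI is available (recent builds may refuse legacy protocols locally regardless of the server):

  # Should fail to handshake if TLS 1.1 is disabled server-side
  openssl s_client -connect www.shadowbit.cc:443 -tls1_1 </dev/null

  # Should complete a handshake
  openssl s_client -connect www.shadowbit.cc:443 -tls1_2 </dev/null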

Content & CMS

100/100

No mixed content detected

No insecure HTTP resources (scripts, images, stylesheets) found in the page HTML.

CMS admin panel not publicly accessible

No publicly accessible CMS admin interface found at common paths.

CMS version not exposed

No CMS version information found in the page source.

Subresource Integrity (SRI)

No external scripts or stylesheets without Subresource Integrity hashes detected.
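
If you later add third-party scripts, SRI pins each one to a known hash so a compromised CDN cannot swap its contents; a sketch with a hypothetical CDN URL and a placeholder hash:

  <!-- integrity value is a placeholder; generate it from the exact file you ship -->
  <script src="https://cdn.example.com/lib.js"
          integrity="sha384-BASE64_HASH_OF_LIB_JS"
          crossorigin="anonymous"></script>

The hash can be generated with:

  openssl dgst -sha384 -binary lib.js | openssl base64 -A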

No open redirect

No open redirect detected via common redirect parameters.

Directory listing disabled

Directory listing is not enabled — files cannot be browsed directly.

Security Headers

85/100

Server version not disclosed

The Server header does not expose version information.

Content-Security-Policy

CSP is set but weakened by 'unsafe-inline' in script-src. These directives allow inline scripts and effectively disable XSS injection protection.

Fix: Remove 'unsafe-inline' and 'unsafe-eval' from your CSP. Replace inline scripts with external files or use nonces/hashes. Test your policy at https://csp-evaluator.withgoogle.com/
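
A nonce-based policy is one way to keep a necessary inline script without 'unsafe-inline'; a sketch, where the nonce is a placeholder that must be regenerated randomly for every response:

  Content-Security-Policy: script-src 'nonce-R4nd0mV4lu3' 'strict-dynamic'; object-src 'none'; base-uri 'self'

  <script nonce="R4nd0mV4lu3">
    /* this inline script runs only because its nonce matches the header */
  </script>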

X-Frame-Options

X-Frame-Options: DENY — protects against clickjacking.

X-Content-Type-Options

X-Content-Type-Options: nosniff is set — prevents MIME-type sniffing.

Referrer-Policy

Referrer-Policy: strict-origin-when-cross-origin

Permissions-Policy

Permissions-Policy header found — browser feature access is restricted.

Cookie security flags

All cookies are set with HttpOnly, Secure, and SameSite flags.
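
For reference, a cookie carrying all three flags looks like this (hypothetical session cookie):

  Set-Cookie: session=abc123; Path=/; Secure; HttpOnly; SameSite=Lax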

Cross-Origin-Opener-Policy

No Cross-Origin-Opener-Policy (COOP) header found. Note: COOP can break popup-based flows (payments, OAuth) and browser back/forward cache.

Fix: Consider adding Cross-Origin-Opener-Policy: same-origin if your site does not use cross-origin popups.

Cross-Origin-Embedder-Policy

No Cross-Origin-Embedder-Policy (COEP) header found. Note: COEP breaks external embeds (YouTube, maps, ads) that don't send CORP headers.

Fix: Consider adding Cross-Origin-Embedder-Policy: require-corp only if your site does not embed third-party content.

Observed response headers:

Server: cloudflare
Referrer-Policy: strict-origin-when-cross-origin
X-Frame-Options: DENY
Permissions-Policy: accelerometer=(), camera=(), geolocation=(), gyroscope=(), magnetometer=(), microphone=(), payment=(), usb=(), interest-cohort=()
X-Content-Type-Options: nosniff
Content-Security-Policy: default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline'; img-src 'self' data: https:; font-src 'self' data:; connect-src 'self'; object-src 'none'; base-uri 'self'; form-action 'self'; frame-ancestors 'none'; upgrade-insecure-requests
Strict-Transport-Security: max-age=15552000; includeSubDomains

Performance & SEO

50/100

Fast server response time (TTFB)

Time To First Byte: 81 ms (measured from our scanner server) — excellent.

Response compression enabled

Compression is enabled (br) — reduces transfer size and speeds up page loads.

robots.txt present

No robots.txt file found.

Fix: Create a robots.txt file to guide search engine crawlers and prevent indexing of sensitive paths.

XML sitemap present

No sitemap.xml found at common locations (/sitemap.xml, /sitemap_index.xml).

Fix: Create and submit an XML sitemap to Google Search Console to improve search indexing.

security.txt present

No security.txt file found at /.well-known/security.txt or /security.txt.

Fix: Create a security.txt file (RFC 9116) at /.well-known/security.txt to provide security researchers with a responsible disclosure contact.
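
A minimal RFC 9116 file, assuming a monitored security mailbox exists at the domain (Contact and Expires are the required fields):

  # Example values; adjust to your real contact details
  Contact: mailto:security@shadowbit.cc
  Expires: 2026-12-31T23:59:59.000Z
  Preferred-Languages: en
  Canonical: https://www.shadowbit.cc/.well-known/security.txt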

Warnings (7)

MTA-STS

What is this?

MTA-STS (Mail Transfer Agent Strict Transport Security) is a standard that forces other mail servers to use encrypted TLS connections when delivering email to your domain. Without it, a network attacker could silently strip TLS from email in transit.

Why does it matter?

Email is delivered between servers using SMTP. By default, SMTP tries TLS but falls back to plaintext if TLS is not available — a downgrade attack. MTA-STS prevents this fallback, ensuring all email delivered to your domain is encrypted in transit.

How to fix it

Implementing MTA-STS requires two things:

1. A DNS TXT record at _mta-sts.yourdomain.com:

  v=STSv1; id=20240101001

2. A policy file hosted at https://mta-sts.yourdomain.com/.well-known/mta-sts.txt:

  version: STSv1
  mode: enforce
  mx: mail.yourdomain.com
  max_age: 86400

Start with mode: testing to see reports before enforcing. Use mta-sts.io for a guided setup.
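
Once both pieces are in place, you can verify them from a shell; a sketch assuming dig and curl are available:

  # Should return the record, e.g. "v=STSv1; id=..."
  dig TXT _mta-sts.shadowbit.cc +short

  # Should return the policy file over HTTPS as plain text
  curl https://mta-sts.shadowbit.cc/.well-known/mta-sts.txt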

HTTP to HTTPS redirect

What is this?

An HTTP to HTTPS redirect automatically sends visitors who type http:// (or click an old link) to the secure https:// version of your site.

Why does it matter?

If HTTP is not redirected, some visitors may unknowingly browse your site without encryption. It also causes duplicate content issues for SEO since the same page exists on both http:// and https://.

How to fix it

Add a 301 redirect in your server config.

Nginx:

  return 301 https://$host$request_uri;

Apache:

  Redirect permanent / https://yourdomain.com/

Or in .htaccess:

  RewriteEngine On
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
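
To confirm the result, inspect the status code directly; a quick check assuming curl is available:

  # The status line should read 301 (not 302) and Location should point to https://
  curl -sI http://www.shadowbit.cc | head -n 3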

HTTP Strict Transport Security (HSTS)

What is this?

HTTP Strict Transport Security (HSTS) is a response header that tells browsers to only ever connect to your site over HTTPS — even if the user types http:// or clicks an http:// link. The browser enforces this locally for the duration of max-age.

Why does it matter?

Even with an HTTP redirect in place, the very first request could go over HTTP before being redirected. A network attacker could intercept that first request (SSL stripping attack). HSTS prevents this by making the browser upgrade to HTTPS before making any request.

How to fix it

Add this header to your HTTPS responses:

  Strict-Transport-Security: max-age=31536000; includeSubDomains

Nginx:

  add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

Apache:

  Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"

Only add HSTS after you are certain your entire site works over HTTPS, including all subdomains if you use includeSubDomains.

Content Security Policy (CSP)

What is this?

Content Security Policy (CSP) is a browser security feature that lets you control which resources (scripts, styles, images, fonts) a page is allowed to load, and from which origins.

Why does it matter?

CSP is one of the most effective defences against Cross-Site Scripting (XSS) attacks. Without CSP, an attacker who injects malicious JavaScript into your page can load resources from anywhere, steal session cookies, or redirect users.

How to fix it

Add a Content-Security-Policy header. Start with a report-only policy to detect issues without breaking anything:

  Content-Security-Policy-Report-Only: default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline';

Once tested, switch to enforcing:

  Content-Security-Policy: default-src 'self'; ...

CSP policies can be complex for sites with third-party scripts. Use https://csp-evaluator.withgoogle.com/ to evaluate your policy.
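
To actually receive violation reports during the report-only phase, add a reporting directive; a sketch assuming a hypothetical /csp-report endpoint on your own server (report-uri is deprecated in favour of report-to but remains widely supported):

  Content-Security-Policy-Report-Only: default-src 'self'; script-src 'self'; report-uri /csp-report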

robots.txt

What is this?

robots.txt is a plain text file at the root of your website that tells search engine crawlers which pages they are and aren't allowed to index.

Why does it matter?

Without a robots.txt, crawlers may index admin panels, staging areas, duplicate content, or other pages that should not appear in search results. A well-configured robots.txt also prevents crawl budget waste on unimportant pages.

How to fix it

Create a file at https://yourdomain.com/robots.txt with at minimum:

  User-agent: *
  Disallow:

  Sitemap: https://yourdomain.com/sitemap.xml

To block specific paths:

  User-agent: *
  Disallow: /admin/
  Disallow: /private/
  Allow: /

WordPress generates robots.txt automatically; check Settings > Reading. Laravel: create public/robots.txt manually.

XML sitemap

What is this?

An XML sitemap is a file that lists all the important URLs on your website, helping search engines discover and index your pages more efficiently.

Why does it matter?

Search engines may miss pages that are not linked from anywhere (orphan pages) or pages deep in your site structure. A sitemap ensures they are found and indexed. It also allows you to signal content priority and update frequency.

How to fix it

Create an XML sitemap at https://yourdomain.com/sitemap.xml

WordPress: install Yoast SEO or use the built-in sitemap at /wp-sitemap.xml
Laravel: use the spatie/laravel-sitemap package
Static sites: generate one with a sitemap generator tool

After creating your sitemap, submit it to:

- Google Search Console: search.google.com/search-console
- Bing Webmaster Tools: bing.com/webmasters

Also reference it in your robots.txt:

  Sitemap: https://yourdomain.com/sitemap.xml
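
For a hand-maintained site, the file itself is simple; a minimal example with a single URL (lastmod is optional, and the date shown is a placeholder):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.shadowbit.cc/</loc>
      <lastmod>2025-01-01</lastmod>
    </url>
  </urlset>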

Get this report emailed to you

Create a free account to save your scan results, monitor your sites, and get alerted when your score drops.

Create free account

Show visitors your security score with an embeddable badge. It updates automatically when you rescan.

<a href="https://webcheckapp.com/scan/BhVMfuRfsmyVxW1r">
  <img src="https://webcheckapp.com/scan/BhVMfuRfsmyVxW1r/badge" alt="Security score: 85/100">
</a>