Security report for
www.audemarspiguet.com
Scanned 1 hour ago
Executive Summary
We performed a comprehensive security analysis of www.audemarspiguet.com across 5 categories. The website received an overall score of 72/100 (grade B-), with 2 critical issues, 7 warnings, and 21 passed checks.
Overall assessment: www.audemarspiguet.com has a reasonable security foundation but there is clear room for improvement. Several issues were identified that could expose the website or its users to unnecessary risk. We recommend addressing the critical issues first, followed by the warnings outlined below.
Strong areas: SSL & HTTPS, Content & CMS
Needs improvement: DNS & Email Security
Needs work: Performance & SEO, Security Headers
Website Health Check
Simple overview for everyone
Is my website safe for visitors?
Not fully — your website is missing important security protections that keep visitors safe.
Can my website be found by Google?
There are issues — search engines may have trouble finding or ranking your website properly.
Is my email protected against spoofing?
Yes — your domain has email authentication records (SPF/DMARC) that prevent others from sending fake emails on your behalf.
Is my website leaking sensitive data?
No leaks detected — configuration files and sensitive data appear to be properly protected.
Does my website respect visitor privacy?
Yes — a privacy policy and cookie consent appear to be in place.
Trust & WHOIS
See domain age, registrar, expiry date, server location, and reputation checks across security databases.
Malware & Reputation
Check if your site is flagged by malware databases, blacklists, and antivirus vendors worldwide.
Advanced Security Checks
Detect open ports, exposed files, API vulnerabilities, TLS weaknesses, and subdomain takeover risks.
Privacy & GDPR
Analyze cookie consent, privacy policy presence, third-party trackers, and GDPR compliance signals.
Quality & Accessibility
Check accessibility compliance, robots.txt, branding, broken links, and carbon footprint.
Unlock the full security report
This Quick Scan covers 5 categories. Upgrade to Pro for OWASP Top 10 analysis, malware detection, exposed files, and 15 more scanners.
Full report
DNS & Email Security
75/100
SPF record configured
SPF record found: "v=spf1 include:_u.audemarspiguet.com._spf.smart.ondmarc.com ~all".
DMARC record configured
DMARC record found with policy "reject": "v=DMARC1; p=reject; pct=100; sp=reject; rua=mailto:a6a3a5b1@inbox.eu.redsift.cloud,mailto:dmarc@audemarspiguet.com; ruf=mailto:a6a3a5b1@inbox.eu.redsift.cloud,mailto:dmarc@audemarspiguet.com; adkim=r; aspf=r; fo=1; rf=afrf; ri=3600".
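A DMARC record like the one above is just a semicolon-separated list of tag=value pairs (tag names per RFC 7489). A minimal sketch in Python for pulling the tags apart, assuming the record string has already been fetched from DNS:

```python
def parse_dmarc(record: str) -> dict:
    """Split a DMARC TXT record such as "v=DMARC1; p=reject; pct=100"
    into a tag -> value dict."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

record = "v=DMARC1; p=reject; pct=100; sp=reject; adkim=r; aspf=r"
policy = parse_dmarc(record)
# policy["p"] == "reject" means mail failing DMARC is rejected outright
```

A "p" value of reject is the strictest policy; "quarantine" and "none" are progressively weaker.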
CAA record configured
No CAA record found. Any Certificate Authority can issue SSL certs for your domain.
Fix: Add a CAA DNS record, e.g.: 0 issue "letsencrypt.org" to restrict SSL issuance.
DKIM record configured
DKIM record found (selector "selector1") — outgoing emails are cryptographically signed.
MTA-STS (email transport security)
MTA-STS record found — sending mail servers are required to use TLS when delivering email to this domain, preventing downgrade attacks.
IPv6 support
No AAAA record found. The domain is IPv4-only.
Fix: Add an AAAA record to support IPv6. Most modern hosting providers and CDNs assign IPv6 addresses automatically.
BIMI record
BIMI record found — your brand logo can appear in supporting email clients (Gmail, Apple Mail, Yahoo) next to emails from your domain.
DNSSEC
DNSSEC could not be verified via this automated check (PHP DNS resolvers strip DNSSEC data). Check with your domain registrar or use dnsviz.net to verify.
SSL & HTTPS
100/100
HTTPS / SSL enabled
The website is accessible over HTTPS.
SSL certificate valid
Certificate is valid and expires on 2026-10-25 (165 days left).
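The "days left" figure can be reproduced from a certificate's notAfter timestamp. A sketch in Python, assuming the date string is in the format returned by Python's ssl.getpeercert():

```python
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    # Parse ssl.getpeercert()'s "notAfter" format, e.g. "Oct 25 12:00:00 2026 GMT"
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

remaining = days_until_expiry("Oct 25 12:00:00 2026 GMT")
```

A common practice is to alert once this drops below roughly 30 days, well before expiry breaks the site.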
HTTP redirects to HTTPS
HTTP traffic is permanently (301) redirected to HTTPS.
HSTS header configured
Strict-Transport-Security header found with max-age=63072000. includeSubDomains is set.
No weak cipher suites
Server does not accept known weak cipher suites (RC4, 3DES, EXPORT, NULL).
TLS 1.0 and 1.1 disabled
Server only accepts TLS 1.2 or higher. Deprecated TLS versions are not supported.
Content & CMS
88/100
No mixed content detected
No insecure HTTP resources (scripts, images, stylesheets) found in the page HTML.
CMS admin panel not publicly accessible
No publicly accessible CMS admin interface found at common paths.
CMS version not exposed
No CMS version information found in the page source.
Subresource Integrity (SRI)
2 of 2 external script(s)/stylesheet(s) load without an integrity= hash. If the CDN is compromised, malicious code could be silently injected into your pages.
Fix: Add integrity= and crossorigin= attributes to external <script> and <link> tags. Generate hashes at https://www.srihash.org/
No open redirect
No open redirect detected via common redirect parameters.
Directory listing disabled
Directory listing is not enabled — files cannot be browsed directly.
Security Headers
50/100
Server version not disclosed
The Server header does not expose version information.
Content-Security-Policy
CSP is set but weakened by 'unsafe-inline' and 'unsafe-eval' in script-src. These directives allow inline scripts and effectively disable XSS injection protection.
Fix: Remove 'unsafe-inline' and 'unsafe-eval' from your CSP. Replace inline scripts with external files or use nonces/hashes. Test your policy at https://csp-evaluator.withgoogle.com/
X-Frame-Options
X-Frame-Options: SAMEORIGIN — protects against clickjacking.
X-Content-Type-Options
X-Content-Type-Options: nosniff is set — prevents MIME-type sniffing.
Referrer-Policy
No Referrer-Policy header found.
Fix: Add Referrer-Policy: strict-origin-when-cross-origin to control how much referrer info is sent.
Permissions-Policy
No Permissions-Policy header found.
Fix: Add a Permissions-Policy header to restrict browser features like camera, microphone, and geolocation.
Cross-Origin-Opener-Policy
No Cross-Origin-Opener-Policy (COOP) header found. Note: COOP can break popup-based flows (payments, OAuth) and browser back/forward cache.
Fix: Consider adding Cross-Origin-Opener-Policy: same-origin if your site does not use cross-origin popups.
Cross-Origin-Embedder-Policy
No Cross-Origin-Embedder-Policy (COEP) header found. Note: COEP breaks external embeds (YouTube, maps, ads) that don't send CORP headers.
Fix: Consider adding Cross-Origin-Embedder-Policy: require-corp only if your site does not embed third-party content.
X-XSS-Protection (deprecated)
X-XSS-Protection: 1; mode=block — Note: this header is deprecated and ignored by modern browsers. Rely on CSP instead.
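Most of the checks in this section boil down to a presence test over the response headers. A minimal sketch, assuming the headers are already available as a dict from an HTTP client (the header list below mirrors the checks above, not an exhaustive standard):

```python
RECOMMENDED = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Frame-Options",
    "X-Content-Type-Options",
    "Referrer-Policy",
    "Permissions-Policy",
]

def missing_security_headers(headers: dict) -> list:
    # HTTP header names are case-insensitive, so compare lowercased
    present = {name.lower() for name in headers}
    return [h for h in RECOMMENDED if h.lower() not in present]

headers = {
    "strict-transport-security": "max-age=63072000; includeSubDomains",
    "x-frame-options": "SAMEORIGIN",
    "x-content-type-options": "nosniff",
}
gaps = missing_security_headers(headers)
```

Presence alone is not sufficient (a CSP with 'unsafe-inline' passes this test), but it catches the outright missing headers flagged above.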
Performance & SEO
25/100
Fast server response time (TTFB)
Time To First Byte: 126 ms (measured from our scanner server) — excellent.
Response compression enabled
No gzip or Brotli compression detected.
Fix: Enable gzip or Brotli compression on your web server. This typically reduces HTML/CSS/JS size by 60-80%.
robots.txt present
No robots.txt file found.
Fix: Create a robots.txt file to guide search engine crawlers and prevent indexing of sensitive paths.
XML sitemap present
No sitemap.xml found at common locations (/sitemap.xml, /sitemap_index.xml).
Fix: Create and submit an XML sitemap to Google Search Console to improve search indexing.
security.txt present
No security.txt file found at /.well-known/security.txt or /security.txt.
Fix: Create a security.txt file (RFC 9116) at /.well-known/security.txt to provide security researchers with a responsible disclosure contact.
Critical issues (2)
Referrer-Policy header missing
What is this?
The Referrer-Policy header controls how much information about the originating page is included in the Referer header when a user navigates away from your site or when resources are loaded.
Why does it matter?
Without a Referrer-Policy, the full URL of the current page (which may include session tokens, user IDs, or sensitive paths) is sent to external sites in the Referer header. This can leak private information to third-party analytics, CDN providers, or ad networks.
How to fix it
Recommended value: Referrer-Policy: strict-origin-when-cross-origin (sends origin only for cross-origin requests, full URL for same-origin)
Nginx: add_header Referrer-Policy "strict-origin-when-cross-origin" always;
Apache: Header always set Referrer-Policy "strict-origin-when-cross-origin"
Alternatives: no-referrer (most private), same-origin (no cross-origin referrer).
Response compression disabled
What is this?
Response compression (gzip or Brotli) reduces the size of HTML, CSS, JavaScript and other text-based responses before sending them over the network.
Why does it matter?
Compression typically reduces text file sizes by 60–80%. A 200 KB JavaScript file becomes ~50 KB. This directly reduces page load time, especially on slower connections, and reduces bandwidth costs.
How to fix it
Nginx:
gzip on;
gzip_types text/plain text/css application/javascript application/json;
gzip_min_length 1000;
For Brotli (better compression, requires the ngx_brotli module):
brotli on;
brotli_types text/plain text/css application/javascript;
Apache (.htaccess):
AddOutputFilterByType DEFLATE text/html text/css application/javascript
Cloudflare: enables compression automatically; no server config needed.
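The size reduction is easy to demonstrate with Python's standard gzip module; a sketch compressing a deliberately repetitive HTML fragment (ratios on real pages will vary):

```python
import gzip

# Repetitive markup, similar in spirit to templated product listings
html = b"<div class='product-card'><span>Royal Oak</span></div>" * 500

compressed = gzip.compress(html)
savings = 1 - len(compressed) / len(html)  # fraction of bytes saved
```

Highly repetitive markup compresses far better than the 60-80% typical figure; the point is that the transfer size, not the on-disk size, is what the visitor downloads.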
Warnings (7)
No CAA record
What is this?
CAA (Certification Authority Authorization) is a DNS record that specifies which Certificate Authorities (CAs) are allowed to issue SSL/TLS certificates for your domain.
Why does it matter?
Without CAA records, any of the hundreds of trusted CAs worldwide can issue a certificate for your domain. A compromised or rogue CA could issue a fraudulent certificate for your domain, enabling MITM attacks. CAA limits this risk to your chosen CA(s).
How to fix it
Add CAA records to your DNS. Example for Let's Encrypt only:
0 issue "letsencrypt.org"
For multiple CAs (e.g. Let's Encrypt + DigiCert):
0 issue "letsencrypt.org"
0 issue "digicert.com"
To also allow wildcard certificates:
0 issuewild "letsencrypt.org"
For email notifications on unauthorized issuance attempts:
0 iodef "mailto:security@yourdomain.com"
Check current CAA records at: sslmate.com/caa
External scripts without Subresource Integrity (SRI)
What is this?
Subresource Integrity (SRI) is a browser security feature that lets you specify a cryptographic hash for external scripts and stylesheets. The browser refuses to execute the resource if its content does not match the hash.
Why does it matter?
If a CDN you rely on is compromised (a real and recurring attack vector), an attacker can replace your JavaScript library with malicious code that steals user data, injects cryptomining scripts, or performs other attacks. SRI prevents this by making the browser verify the file has not been altered.
How to fix it
Add integrity= and crossorigin= attributes to your external resources:
<script
  src="https://cdn.jsdelivr.net/npm/jquery@3.7.1/dist/jquery.min.js"
  integrity="sha256-/JqT3SQfawRcv/BIHPThkBvs0OEvtFFmqPF/lYI/Cxo="
  crossorigin="anonymous"
></script>
Generate hashes for any URL at: https://www.srihash.org/
For build tools, use webpack-subresource-integrity or vite-plugin-sri to add hashes automatically during builds.
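An integrity value is just the algorithm name plus a base64-encoded digest of the file's exact bytes. A sketch using Python's standard library, assuming you already have the script's content as bytes:

```python
import base64
import hashlib

def sri_hash(content: bytes, algo: str = "sha256") -> str:
    # SRI value = "<algo>-" + base64(digest of the exact bytes served)
    digest = hashlib.new(algo, content).digest()
    return algo + "-" + base64.b64encode(digest).decode("ascii")

tag_value = sri_hash(b"console.log('hello');")
```

Any change to the file invalidates the hash, so pin a specific library version and regenerate the hash whenever you bump it.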
Content-Security-Policy weakened by 'unsafe-inline' and 'unsafe-eval'
What is this?
Content Security Policy (CSP) is a browser security feature that lets you control which resources (scripts, styles, images, fonts) a page is allowed to load, and from which origins.
Why does it matter?
CSP is one of the most effective defences against Cross-Site Scripting (XSS) attacks. Without CSP, an attacker who injects malicious JavaScript into your page can load resources from anywhere, steal session cookies, or redirect users.
How to fix it
Add a Content-Security-Policy header. Start with a report-only policy to detect issues without breaking anything:
Content-Security-Policy-Report-Only: default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline';
Once tested, switch to enforcing:
Content-Security-Policy: default-src 'self'; ...
CSP policies can be complex for sites with third-party scripts. Use https://csp-evaluator.withgoogle.com/ to evaluate your policy.
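The specific weakness flagged for this site ('unsafe-inline' and 'unsafe-eval' in script-src) can be caught with a simple directive scan; a rough sketch, not a full CSP parser:

```python
RISKY_TOKENS = ("'unsafe-inline'", "'unsafe-eval'")

def weak_script_sources(csp: str) -> list:
    """Return risky tokens found in script-src (falling back to default-src)."""
    directives = {}
    for chunk in csp.split(";"):
        parts = chunk.strip().split()
        if parts:
            directives[parts[0]] = parts[1:]
    sources = directives.get("script-src", directives.get("default-src", []))
    return [t for t in RISKY_TOKENS if t in sources]

findings = weak_script_sources(
    "default-src 'self'; script-src 'self' 'unsafe-inline' 'unsafe-eval'"
)
```

A real evaluator like csp-evaluator.withgoogle.com also checks wildcard hosts, missing object-src, and nonce/hash usage; this sketch only covers the two tokens the scan reported.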
Permissions-Policy header missing
What is this?
Permissions-Policy (formerly Feature-Policy) lets you control which browser features and APIs your site is allowed to use, and whether third-party content embedded in iframes can access them.
Why does it matter?
Without this header, embedded third-party scripts or iframes could theoretically request access to the camera, microphone, geolocation, payment APIs, and more. Restricting these features reduces your attack surface.
How to fix it
Example header that disables features not needed for most sites:
Permissions-Policy: camera=(), microphone=(), geolocation=(), payment=()
Nginx: add_header Permissions-Policy "camera=(), microphone=(), geolocation=()" always;
Apache: Header always set Permissions-Policy "camera=(), microphone=(), geolocation=()"
Only disable features you genuinely don't use. Adding this header is a low-effort, high-value improvement.
robots.txt missing
What is this?
robots.txt is a plain text file at the root of your website that tells search engine crawlers which pages they are and aren't allowed to index.
Why does it matter?
Without a robots.txt, crawlers may index admin panels, staging areas, duplicate content, or other pages that should not appear in search results. A well-configured robots.txt also prevents crawl budget waste on unimportant pages.
How to fix it
Create a file at https://yourdomain.com/robots.txt with at minimum:
User-agent: *
Disallow:
Sitemap: https://yourdomain.com/sitemap.xml
To block specific paths:
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
WordPress: generated automatically. Check Settings > Reading.
Laravel: create public/robots.txt manually.
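A draft robots.txt can be sanity-checked before deploying it, using Python's built-in urllib.robotparser (the example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Feed the draft rules directly to the parser, line by line
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
])

blocked = rp.can_fetch("*", "https://example.com/admin/login")  # matches Disallow: /admin/
allowed = rp.can_fetch("*", "https://example.com/products")
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control; sensitive paths still need authentication.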
XML sitemap missing
What is this?
An XML sitemap is a file that lists all the important URLs on your website, helping search engines discover and index your pages more efficiently.
Why does it matter?
Search engines may miss pages that are not linked from anywhere (orphan pages) or pages deep in your site structure. A sitemap ensures they are found and indexed. It also allows you to signal content priority and update frequency.
How to fix it
Create an XML sitemap at https://yourdomain.com/sitemap.xml
WordPress: install Yoast SEO or use the built-in sitemap at /wp-sitemap.xml
Laravel: use the spatie/laravel-sitemap package
Static sites: generate with a sitemap generator tool
After creating your sitemap, submit it to:
- Google Search Console: search.google.com/search-console
- Bing Webmaster Tools: bing.com/webmasters
Also reference it in your robots.txt:
Sitemap: https://yourdomain.com/sitemap.xml
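For a small static site, a minimal sitemap can be generated with Python's standard xml.etree module; a sketch, with placeholder URLs:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # <urlset> root with the sitemaps.org namespace, one <url><loc> per page
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://yourdomain.com/", "https://yourdomain.com/collections"])
```

Optional <lastmod> and <priority> children can be added per URL the same way, but a bare list of <loc> entries is already a valid sitemap.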