# webscan

Webscan tries to retrieve as much information about URLs and IPs as possible from an external perspective.
For a specified URL or IP, it covers
- DNS configuration
- domain and nameserver ownership
- IPv4 and IPv6 availability
- IP address ownership
- blacklisting status
- open ports
- SSL validity
- SSL configuration safety
- HTTP/S configuration with redirects
- host headers
- cookies
- HTML, JS, and CSS sizes
- ...

and gives improvement recommendations based on best practices.
## Usage

```sh
webscan google.com # Scan domain and website
webscan 192.168.0.1 # Scan IP and website
webscan https://github.com/thetillhoff/webscan # Scan domain and website at specific path
webscan --help # Learn more about running specific scans
```
## Installation

If you're feeling fancy:

```sh
curl -s https://raw.githubusercontent.com/thetillhoff/webscan/main/install.sh | sh
```

or install manually from https://github.com/thetillhoff/webscan/v3/releases/latest.
## Features
### DNS

Display DNS information about the provided URL and give improvement recommendations.

- Skipped if the input is an IPv4 or IPv6 address.
- Check who the owner of the domain is via RDAP (not supported for country-code TLDs).
- Check who the owner of the DNS zone is (== nameserver owner).
- Follow CNAMEs.
- DNS records overview.
- Warn about long DNS names (requires the `--opinionated` flag).
- DNS best practices (TTL values, SOA, ...).
- DNSSEC.
- Warn if there is more than one CNAME redirect.
- Detect CNAME loops.
- Domain blacklist detection.
- Scan DNS of the domain even if the input is a domain with a path (like "github.com/webscan").
- Specify a custom DNS server with the `--dns <dns server location>` option.
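The CNAME-following checks above boil down to walking a chain of records while watching for loops and excessive hops. A minimal sketch in Python, assuming the records have already been resolved into a simple name-to-target map (the function name and structure are illustrative, not webscan's actual code):

```python
def follow_cnames(records: dict[str, str], name: str, max_hops: int = 10):
    """Follow a chain of CNAME records, detecting loops.

    `records` maps a DNS name to its CNAME target; names without an
    entry terminate the chain (e.g. they hold A/AAAA records instead).
    Returns (chain, warnings).
    """
    chain = [name]
    warnings = []
    seen = {name}
    while chain[-1] in records and len(chain) <= max_hops:
        target = records[chain[-1]]
        if target in seen:
            warnings.append(f"CNAME loop detected at {target}")
            break
        seen.add(target)
        chain.append(target)
    if len(chain) > 2:  # more than one CNAME hop before the final record
        warnings.append("more than one CNAME redirect")
    return chain, warnings
```

A real scanner would of course query live DNS instead of a prebuilt map, but the loop and chain-length logic stays the same.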
### DNS mail security

- [~] SPF
  - Verify usage of allowed mechanisms, modifiers, and qualifiers.
  - Verify contained IPv4 addresses.
  - Verify contained IPv6 addresses.
  - Verify the record as described in the spec.
  - Recursively check any referenced external information.
- [~] DKIM
  - TXT variant detection
  - CNAME variant detection
  - TXT variant verification
  - CNAME variant recursive check
- [~] DMARC
  - TXT variant detection
  - CNAME variant detection
  - TXT variant verification
  - CNAME variant recursive check
- MX blacklist detection
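The SPF checks above (allowed mechanisms, modifiers, qualifiers, and contained addresses) can be sketched as a small record validator. This is a simplified illustration, not a full RFC 7208 implementation; the function name and the exact problem messages are assumptions:

```python
import ipaddress

ALLOWED_MECHANISMS = {"all", "include", "a", "mx", "ptr", "ip4", "ip6", "exists"}
ALLOWED_MODIFIERS = {"redirect", "exp"}
QUALIFIERS = "+-~?"

def check_spf(record: str) -> list[str]:
    """Return a list of problems found in an SPF TXT record (empty if none)."""
    problems = []
    terms = record.split()
    if not terms or terms[0] != "v=spf1":
        return ["record does not start with v=spf1"]
    for term in terms[1:]:
        if "=" in term:  # modifier, e.g. redirect=_spf.example.com
            name = term.split("=", 1)[0]
            if name not in ALLOWED_MODIFIERS:
                problems.append(f"unknown modifier: {name}")
            continue
        if term[0] in QUALIFIERS:  # strip optional qualifier prefix
            term = term[1:]
        mech, _, value = term.partition(":")
        if mech not in ALLOWED_MECHANISMS:
            problems.append(f"unknown mechanism: {mech}")
        elif mech in ("ip4", "ip6"):
            try:
                ipaddress.ip_network(value, strict=False)
            except ValueError:
                problems.append(f"invalid address in {mech}: {value}")
    return problems
```

A real validator would also handle prefix-length suffixes on `a`/`mx` and recursively fetch `include:` targets, as the checklist above notes.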
### Subdomain finder

- Search for subdomains of the provided domain and provide a list of them.
- Search for subdomains in the subject and subject alternative name list of the original domain's TLS certificate.
- Check other DNS entries (like PTR), certificate pointers, the SPF record, certificate logs, and reverse IP lookups.
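Extracting subdomains from a certificate's subject alternative names is the simplest of these sources. A sketch, assuming the SAN list has already been pulled out of the certificate (the function name is illustrative):

```python
def subdomains_from_sans(base_domain: str, sans: list[str]) -> set[str]:
    """Filter a certificate's subject alternative names down to
    subdomains of base_domain; wildcard labels are stripped."""
    found = set()
    for san in sans:
        name = san.removeprefix("*.")  # "*.api.example.com" -> "api.example.com"
        if name != base_domain and name.endswith("." + base_domain):
            found.add(name)
    return found
```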
### IPv6 readiness

- Check if both IPv4 and IPv6 entries were found.
  - IPv4 is necessary to stay backwards compatible.
  - IPv6 is needed to be IPv6-ready.
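Given the resolved addresses, this check is just classifying them by address family. A minimal sketch using the standard library (the function name is an assumption):

```python
import ipaddress

def ipv6_readiness(addresses: list[str]) -> dict[str, bool]:
    """Classify resolved addresses by family and report readiness."""
    families = {ipaddress.ip_address(a).version for a in addresses}
    return {
        "ipv4": 4 in families,  # needed for backwards compatibility
        "ipv6": 6 in families,  # needed to be reachable over IPv6
    }
```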
### IP analysis

- Check who the hosting provider of the IP address is via RDAP (the successor of WHOIS) - like AWS, Azure, GCP, ...
- Check if any IP (v4 and v6) of the domain is blacklisted.
### Open ports

- Check all found IPv4 and IPv6 entries for relevant open ports. Examples of relevant ports are SSH, FTP, SMB, SMTP, HTTP, and Postgres.
- Check whether FTP is disabled completely (only use SFTP or FTPS).
- Check whether SSH has password authentication disabled and uses a secure configuration.
- Check ports in parallel, since the connection timeout is set to 2s, which would otherwise add up quickly.
- Check if open ports match across all IPs.
- If the HTTP detection feature is enabled, check the HTTP and HTTPS ports even if the port scan feature is not.
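The parallel-scanning point above can be sketched with a thread pool and a per-connection timeout: probing N ports sequentially against a filtered host could take N × 2s, while probing in parallel keeps the total near one timeout. A sketch with assumed names:

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def scan_ports(host: str, ports: list[int], timeout: float = 2.0) -> dict[int, bool]:
    """Attempt a TCP connect to each port in parallel; True means open."""
    def probe(port: int) -> bool:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            return sock.connect_ex((host, port)) == 0
    with ThreadPoolExecutor(max_workers=len(ports)) as pool:
        results = pool.map(probe, ports)
    return dict(zip(ports, results))
```

A plain TCP connect only shows that something is listening; the FTP/SSH configuration checks above would additionally need to speak the respective protocols.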
### SSL/TLS check

- Validate the certificate only if port 443 is open.
- Check the validity of the SSL certificate: subject, date, chain, ciphers, TLS minimum version (if valid but not recommended, don't fail, but print a warning instead).
- Write tests against badssl.com.
- SSL is not recommended.
- TLS 1.0 and TLS 1.1 are not recommended; only TLS 1.2 & 1.3 are okay.
- TLS 1.3 should be supported.
- Cipher recommendations, like:
  - Recommending against 3DES, as it's vulnerable to birthday attacks (https://sweet32.info/).
  - Recommending against RC4, as its exploitable biases can lead to plaintext recovery without side channels (https://www.rc4nomore.com/).
  - Recommending against CBC, as it seems fundamentally flawed since the Lucky13 vulnerability was discovered (https://en.wikipedia.org/wiki/Lucky_Thirteen_attack).
  - Keep in mind ECDH_ ciphers don't support Perfect Forward Secrecy and shouldn't be used after 2026.
  - Keep in mind DH_ ciphers don't support Perfect Forward Secrecy and shouldn't be used after 2026.
- Check who the issuer of the certificate is. If it's one of the best-known paid providers, recommend using a free one like Let's Encrypt.
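The cipher recommendations above amount to matching negotiated suite names against a discouraged list. A sketch whose markers and reasons mirror this README's list; the function name and message format are assumptions:

```python
# Substring markers for cipher suites the checks above recommend against.
DISCOURAGED = {
    "3DES": "vulnerable to birthday attacks (Sweet32)",
    "RC4": "exploitable biases allow plaintext recovery",
    "CBC": "fundamentally flawed since Lucky13",
    "ECDH_": "no Perfect Forward Secrecy",
    "DH_": "no Perfect Forward Secrecy",
}

def cipher_warnings(suites: list[str]) -> list[str]:
    """Flag discouraged cipher suites; one warning per suite."""
    warnings = []
    for suite in suites:
        for marker, reason in DISCOURAGED.items():
            if marker in suite:
                warnings.append(f"{suite}: {reason}")
                break  # first matching reason is enough
    return warnings
```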
### HTTP detection

By default, webscan assumes you're using HTTPS. Yet, it will check whether the site is available via HTTP as well.

- Optionally follow HTTP redirects (status codes 30x).
- If HTTP is available, it should be used for redirects only.
- If HTTPS is available, it should either redirect or respond with a 200 status code.
- If both HTTP and HTTPS are available
- Check which HTTP versions are supported by the webserver (like HTTP/1.1, HTTP/2, HTTP/3 aka QUIC).
- [x] Analyze host headers of the response and recommend best practices.
- Check HTTP Strict Transport Security (HSTS / STS). This defeats attacks such as SSL stripping, and also avoids the round-trip cost of the 301 redirect from HTTP to HTTPS.
- CSP header settings (having one is the minimum requirement here).
- Scan cookies:
  - number
  - length
  - used characters
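The header checks above (HSTS and CSP presence as the minimum bar) can be sketched as a pure function over a response's header map; header names are case-insensitive per the HTTP spec. Names and messages are illustrative:

```python
def header_recommendations(headers: dict[str, str]) -> list[str]:
    """Check response headers against the best practices above."""
    lower = {k.lower(): v for k, v in headers.items()}
    recs = []
    if "strict-transport-security" not in lower:
        recs.append("add a Strict-Transport-Security (HSTS) header")
    if "content-security-policy" not in lower:
        recs.append("add a Content-Security-Policy header")
    return recs
```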
### HTML content

Print recommendations for the HTML code.

- Scan the HTML contents of the path even if the input is a domain with a path (like "github.com/webscan").
- Check if compression is enabled.
- HTML validation -> it has to be parsable to look further into the details.
- HTML5 check if size of body > 0.
- `<!DOCTYPE html>` is the first node.
- `<html lang="en-US">` - the html node has a lang attribute set (its value isn't validated).
- [~] HTML parsing - requires go-colly for finding references to other files.
- [~] check html
  - size < 200kb
  - validation
  - minification
  - html5 validation
- [~] check css (if size of html > 0)
  - size
  - validation
  - minification
  - feature support recommendation
- [~] check js (if size of html > 0)
  - size
  - validation
  - minification
  - feature support recommendation
- Detect references to outdated JavaScript dependencies.
- check images (if size of html > 0)
  - size (< 500kb)
  - image format -> webp
  - check images one by one; each shouldn't be too large, and the total size shouldn't be either
- HTML accessibility check.
- Don't use fully-qualified URLs in links on the same page ("https://example.com/b" when you're already on "https://example.com/a" -> use "/b" instead, as it's smaller and less error-prone).
- Check that all links on the page also use HTTPS (or are relative to the current page).
- But due to mixed content security concerns, an HTTP