PHANTOMJS
HIGH RISK · ⚡ AUTOMATION AGENT
Legacy headless browser — deprecated but still seen in scraping and legacy automation
📡 PHANTOMJS USER-AGENT STRING
Mozilla/5.0 (Unknown; Linux x86_64) AppleWebKit/538.1 (KHTML, like Gecko) PhantomJS/2.1.1 Safari/538.1
This is the User-Agent header sent by PhantomJS in HTTP requests. Use this to identify PhantomJS in your server access logs.
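A minimal sketch of matching this User-Agent server-side (function name and the non-PhantomJS sample UA are illustrative, not from the source). As noted below, the header is trivially spoofed, so treat a match as a positive signal only, never its absence as proof of a real browser:

```javascript
// Flag PhantomJS by the 'PhantomJS/x.y.z' token in its default User-Agent.
// A match is strong evidence; no match proves nothing (UA is spoofable).
function isPhantomUA(userAgent) {
  return /PhantomJS\/[\d.]+/.test(userAgent || "");
}

const ua =
  "Mozilla/5.0 (Unknown; Linux x86_64) AppleWebKit/538.1 (KHTML, like Gecko) PhantomJS/2.1.1 Safari/538.1";
console.log(isPhantomUA(ua)); // true
```

The same regex works for grepping server access logs offline, since the default UA string is logged verbatim.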
📋 ABOUT PHANTOMJS
PhantomJS is a legacy headless browser based on WebKit, officially archived and unmaintained since 2018. Despite being deprecated, PhantomJS continues to appear in web server logs worldwide due to its presence in legacy scraping systems, older automation pipelines, and unmaintained bot frameworks that were built before modern alternatives like Puppeteer and Playwright existed.
PhantomJS is relatively easy to detect compared to modern headless browsers. Its outdated WebKit engine (version 538.1) lacks support for many modern web APIs, creating detectable gaps in browser feature support. Additionally, PhantomJS exposes unique JavaScript globals (window.callPhantom, window._phantom) that are not present in standard browsers.
NORAD.io classifies PhantomJS traffic as high risk because its continued use in 2025-2026 typically indicates unmaintained scraping infrastructure or deliberately low-sophistication attack tools. Legitimate automation use cases have overwhelmingly migrated to Puppeteer, Playwright, or Selenium, making PhantomJS presence a strong signal for unwanted automated access.
🎯 HOW TO DETECT PHANTOMJS
- Default User-Agent contains 'PhantomJS' — easily spoofed
- Uses an outdated WebKit engine (538.1) — detectable via feature support
- Missing modern browser APIs (IntersectionObserver, Web Components, etc.)
- window.callPhantom and window._phantom are PhantomJS-specific globals
- Outdated TLS capabilities — limited cipher suite support
- Canvas and WebGL fingerprints are distinctive due to the old rendering engine
🔄 CRAWL BEHAVIOR
Full JavaScript rendering via WebKit engine. No longer maintained (archived in 2018) but still actively used in legacy scraping systems. Can emulate various screen sizes and user agents. No built-in crawl politeness.
Originally designed for headless website testing and screenshot generation. Now primarily seen in legacy scraping systems, automated testing pipelines, and older bot frameworks that haven't migrated to modern alternatives.
🤖 ROBOTS.TXT CONFIGURATION
# PhantomJS does not check robots.txt.
# Detection via User-Agent 'PhantomJS' in default configuration.
# Consider blocking — legitimate use cases have largely migrated to Puppeteer/Playwright.
⚠ PhantomJS does not read robots.txt at all, so directives alone will not stop it. Supplement with IP-level blocking or bot detection middleware.
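One way to add that middleware layer, sketched as an Express-style `(req, res, next)` handler (the function name is illustrative, and this assumes default, unspoofed User-Agent headers):

```javascript
// Express-style middleware that rejects requests whose User-Agent
// advertises PhantomJS. Pair with IP-level rules to catch spoofed UAs.
function blockPhantom(req, res, next) {
  const ua = req.headers["user-agent"] || "";
  if (/PhantomJS/i.test(ua)) {
    res.statusCode = 403; // refuse the request outright
    res.end("Forbidden");
    return;
  }
  next(); // not PhantomJS by UA; let the request proceed
}
```

Because it only touches `req.headers`, `res.statusCode`, `res.end`, and `next`, the same function also drops into a plain Node.js `http` server without the Express framework.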
🔗 RELATED BOTS
Headless Chrome · Unknown · Automated headless Chrome browsers — commonly used for scraping, testing, and bot activity
Puppeteer · Google · Google's Node.js browser automation library — widely used for scraping and testing
Playwright · Microsoft · Microsoft's cross-browser automation framework — used for testing and scraping
Selenium · Open Source · The original browser automation framework — still widely used for testing and scraping