MJ12BOT

LOW RISK · 📊 SEO & DATA SCRAPER

Majestic's SEO crawler — builds the world's largest link intelligence database

ORGANIZATION
Majestic
FIRST SEEN
2006-01
RESPECTS ROBOTS.TXT
✓ YES
DOCUMENTATION
majestic.com

📡 MJ12BOT USER-AGENT STRING

Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)

This is the User-Agent header sent by MJ12bot in HTTP requests. Use this to identify MJ12bot in your server access logs.
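To illustrate log identification, here is a minimal Python sketch that scans an access-log line for the MJ12bot token. The sample line is hypothetical (combined log format); the regex matches any version suffix rather than a fixed release.

```python
import re

# Hypothetical access-log line in combined log format (illustrative only)
line = ('203.0.113.7 - - [12/May/2024:10:15:32 +0000] "GET /page HTTP/1.1" '
        '200 5120 "-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"')

# Match the bot token with any version number, e.g. MJ12bot/v1.4.8
match = re.search(r'MJ12bot/v[\d.]+', line)
if match:
    print(f"MJ12bot hit: {match.group(0)}")  # → MJ12bot hit: MJ12bot/v1.4.8
```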

📋 ABOUT MJ12BOT

MJ12bot is the web crawler behind Majestic, one of the longest-running link intelligence platforms in the SEO industry. Majestic maintains one of the world's largest databases of web links, and MJ12bot continuously crawls the web to discover and catalog link relationships between websites.

MJ12bot focuses primarily on link discovery rather than content extraction. It follows links across pages to map the interconnections between websites, feeding Majestic's Trust Flow and Citation Flow metrics that are widely used in the SEO industry for evaluating website authority and backlink quality.

NORAD.io monitors MJ12bot as part of its SEO crawler tracking. While MJ12bot is a well-behaved crawler that respects robots.txt and Crawl-delay directives, its continuous crawling can accumulate significant traffic over time. NORAD provides visibility into MJ12bot activity to help site operators manage their SEO crawler allowances.

🎯 HOW TO DETECT MJ12BOT

  • User-Agent contains 'MJ12bot' (currently reported as MJ12bot/v1.4.8; match the token rather than a specific version)
  • Crawl patterns focus on link discovery — follows many links per page
  • Respects Crawl-delay directive in robots.txt
  • Full bot documentation at majestic.com/reports/majestic-bot
  • Lower content extraction focus compared to search engine crawlers
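The first check above can be sketched as a small helper. Note that the User-Agent header can be spoofed, so a string match identifies claimed MJ12bot traffic rather than verified traffic; the function name is ours, not part of any official tooling.

```python
def is_mj12bot(user_agent: str) -> bool:
    """Case-insensitive check for the MJ12bot token in a User-Agent header."""
    return "mj12bot" in user_agent.lower()

print(is_mj12bot("Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"))  # → True
print(is_mj12bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))                        # → False
```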

🔄 CRAWL BEHAVIOR

Continuous crawling focused on link discovery and mapping, at moderate request rates. Respects robots.txt and Crawl-delay. Prioritizes discovering new links over full content extraction.

PURPOSE

Builds Majestic's link intelligence database, which maps the link structure of the web. Used by SEO professionals for backlink analysis, trust flow metrics, and competitive intelligence.

🤖 ROBOTS.TXT CONFIGURATION

User-agent: MJ12bot
Crawl-delay: 10
Allow: /

# To block:
# User-agent: MJ12bot
# Disallow: /

MJ12bot respects robots.txt directives. Add this to your robots.txt file at the root of your domain.
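You can sanity-check the robots.txt rules above with Python's standard-library parser. This sketch parses the snippet in memory and confirms what MJ12bot would be allowed to fetch and what crawl delay it would observe (the example.com URL is a placeholder).

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: MJ12bot
Crawl-delay: 10
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("MJ12bot", "https://example.com/page"))  # → True
print(rp.crawl_delay("MJ12bot"))                            # → 10
```

Swapping in the commented-out `Disallow: /` block would make `can_fetch` return False, blocking the crawler site-wide.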


PROTECT YOUR WEBSITE

Deploy SiteTrust to monitor and control AI bot access to your site with the Agent Passport Standard.

INSTALL SITETRUST →