Free Sitemap & Robots.txt Checker Tool
Sitemap checker — find your XML sitemap, validate robots.txt rules, and check Google indexing status. Fix crawlability issues before they hurt your rankings.
Sitemap Checker
Find your website’s XML sitemap and count the total number of URLs it contains with our free sitemap checker.
Enter any domain and instantly locate its XML sitemap. Counts total URLs in the sitemap, checks if it’s properly formatted, and verifies it’s accessible to Googlebot. Also detects sitemap index files that reference multiple child sitemaps.
Sitemap Finder
Checks common sitemap locations and counts total URLs. Export to CSV or PDF.
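As a sketch of how such a finder can work, the snippet below probes a few common sitemap locations and counts entries, distinguishing a sitemap index from a plain urlset. The path list and helper names are illustrative assumptions; a production tool would also read the Sitemap: line from robots.txt and handle gzipped sitemaps.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Common locations a finder can probe (illustrative, not exhaustive).
COMMON_PATHS = ["/sitemap.xml", "/sitemap_index.xml", "/sitemap/sitemap.xml"]
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text: str):
    """Return ('index' or 'urlset', number of entries) for a sitemap document."""
    root = ET.fromstring(xml_text)
    if root.tag == f"{NS}sitemapindex":
        # Sitemap index file: entries are child sitemaps, not pages.
        return "index", len(root.findall(f"{NS}sitemap"))
    return "urlset", len(root.findall(f"{NS}url"))

def find_sitemap(domain: str):
    """Probe common paths; return (sitemap_url, (kind, count)) or (None, None)."""
    for path in COMMON_PATHS:
        url = f"https://{domain}{path}"
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if resp.status == 200:
                    return url, count_sitemap_urls(resp.read().decode("utf-8"))
        except OSError:
            continue
    return None, None
```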
SEO Sitemap Finder
Deep sitemap analysis — validate sitemap structure, check for errors, and verify all URLs are indexable.
Goes beyond finding the sitemap. Validates XML structure, checks lastmod dates, detects URLs blocked by robots.txt, identifies noindex pages included in the sitemap (a common SEO mistake), and verifies the sitemap is submitted to Google Search Console.
🗺️ Sitemap Finder
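The deeper checks described above can be sketched with the standard library alone: validate each <loc>, parse lastmod as a date, and flag sitemap URLs that the site's own robots.txt disallows. Function and variable names here are illustrative assumptions, not the tool's actual implementation.

```python
import xml.etree.ElementTree as ET
from datetime import datetime
from urllib.robotparser import RobotFileParser

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def audit_sitemap(xml_text: str, robots_txt: str, user_agent: str = "Googlebot"):
    """Return a list of (url, problem) pairs found in the sitemap."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    issues = []
    for entry in ET.fromstring(xml_text).findall(f"{NS}url"):
        loc = entry.findtext(f"{NS}loc", "").strip()
        if not loc.startswith(("http://", "https://")):
            issues.append((loc, "missing or relative <loc>"))
        elif not rp.can_fetch(user_agent, loc):
            # A blocked URL in the sitemap is a contradiction worth flagging.
            issues.append((loc, "blocked by robots.txt"))
        lastmod = entry.findtext(f"{NS}lastmod")
        if lastmod:
            try:
                datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
            except ValueError:
                issues.append((loc, f"invalid lastmod: {lastmod}"))
    return issues
```

Detecting noindex pages would additionally require fetching each URL and inspecting its meta tags, which is omitted here.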
Robots.txt Viewer
View the full robots.txt file of any website and check which pages are blocked from crawling.
Fetches and displays the complete robots.txt file for any domain. Parses all Disallow and Allow rules for every user-agent, highlights potentially dangerous rules that block important pages, and shows the sitemap declaration within the file.
Robots.txt Viewer
Fetches robots.txt and shows content. Export to CSV or PDF.
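A simplified version of this fetch-and-parse step might look like the following: it groups Allow/Disallow rules under each user-agent (consecutive User-agent lines share one rule group, per the robots.txt format) and collects any Sitemap: declarations. This is a sketch that skips edge cases such as wildcard matching, which a full viewer would handle.

```python
import urllib.request

def fetch_robots(domain: str) -> str:
    """Fetch the raw robots.txt text for a domain."""
    with urllib.request.urlopen(f"https://{domain}/robots.txt", timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def parse_robots(text: str):
    """Return ({user_agent: [(rule, path), ...]}, [sitemap_urls])."""
    groups, sitemaps = {}, []
    current = []        # user-agents of the group being built
    seen_rule = True    # consecutive User-agent lines share one group
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # strip comments
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if seen_rule:
                current = []
                seen_rule = False
            current.append(value)
            groups.setdefault(value, [])
        elif field in ("allow", "disallow"):
            seen_rule = True
            for agent in current:
                groups[agent].append((field, value))
        elif field == "sitemap":
            sitemaps.append(value)
    return groups, sitemaps
```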
Robots.txt Editor & Tester
Edit and test robots.txt rules — check if a specific URL is blocked or allowed before publishing changes.
Test any robots.txt rule against a specific URL to see if it would be blocked or allowed. Prevent costly mistakes — a single wrong Disallow rule can accidentally block your entire website from Google. Test before you publish any robots.txt changes.
🤖 Robots.txt Editor
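One way to test a draft rule before publishing is Python's built-in robots.txt parser, shown below with placeholder rules and URLs. Note that urllib.robotparser does not implement Google's wildcard (*) or end-anchor ($) extensions, so for complex rules a dedicated tester remains the safer check.

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Check whether a URL would be crawlable under a robots.txt draft."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

# Example draft: block the checkout flow for all crawlers.
draft = """User-agent: *
Disallow: /checkout/
"""
print(is_allowed(draft, "Googlebot", "https://example.com/checkout/cart"))  # False
print(is_allowed(draft, "Googlebot", "https://example.com/blog/post"))      # True
```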
Google Index Checker
Check whether your website pages are currently indexed by Google — find de-indexed URLs instantly.
Verify if any URL is currently indexed by Google. A page that isn’t indexed simply cannot rank — no matter how good its content is. Identify de-indexed pages caused by noindex tags, robots.txt blocks, or manual penalties, and fix them immediately.
🚀 Google Indexer Tool
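Confirming index status itself requires Google's data, but two of the de-indexing causes named above can be checked directly on the page: a noindex robots meta tag in the HTML, or a noindex X-Robots-Tag response header. A hedged sketch using only the standard library:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Detect <meta name="robots"/"googlebot"> tags containing noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True

def has_noindex(html: str, headers=None) -> bool:
    """True if the page blocks indexing via header or meta tag."""
    if headers and "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

Robots.txt blocks are a separate check (see the tester above is not assumed here); this sketch covers only the on-page signals.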
Check Sitemap & Robots in 3 Steps
No account. No API key. Instant results from our free sitemap checker and robots.txt validator.
Enter Your Domain
Type your website URL into the sitemap or robots checker above.
Instant Analysis
Sitemap URLs, robots rules, and indexing status appear in seconds.
Fix & Resubmit
Fix crawl issues and resubmit your sitemap to Google Search Console.
Sitemap & Robots.txt Best Practices
Follow these rules to help Google crawl and index your site correctly. According to Google's official sitemap documentation, submitting a sitemap helps Google discover your pages faster and more completely, especially on large or newly launched sites.
Submit Sitemap to GSC
Always submit your sitemap.xml to Google Search Console. This helps Google find and index your pages faster.
Never Block CSS/JS in Robots
Blocking CSS and JavaScript in robots.txt prevents Google from rendering your pages correctly, hurting your rankings.
Only Include Indexable Pages
Your sitemap should only list pages that are indexable (index, follow). Never include noindex pages or URLs blocked by robots.txt.
Update lastmod Dates
Keep lastmod dates in your sitemap accurate. Google uses these to prioritize recrawling updated content.
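For reference, a well-formed urlset entry with a lastmod date looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/blog/post</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>
```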
Reference Sitemap in Robots.txt
Add Sitemap: https://yourdomain.com/sitemap.xml to your robots.txt so all crawlers can find it automatically.
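A robots.txt following this practice might look like the sketch below (yourdomain.com is the same placeholder used above; the Disallow rule is only an example):

```text
User-agent: *
Disallow: /admin/

Sitemap: https://yourdomain.com/sitemap.xml
```

The Sitemap: line is independent of user-agent groups, so it can appear anywhere in the file.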
Test Before Publishing
Always test robots.txt rule changes with the tester tool before going live. One wrong rule can block your whole site.
Continue Your Technical SEO Audit
Once your sitemap checker results are clean, use these tools for a complete technical audit.