How to Generate an XML Sitemap
Create a properly formatted XML sitemap for your website with our free Sitemap Generator. Help search engines discover and crawl all your pages.
Steps
Enter your website URLs
Add all the pages you want search engines to discover and index. Include your most important pages: homepage, category pages, individual posts and articles, product pages, and key landing pages. Do not include pages you have blocked with noindex tags or robots.txt disallow rules — submitting blocked pages wastes crawl budget.
Set priority for each URL
Priority values range from 0.0 to 1.0. The homepage is typically 1.0, category pages 0.8, individual content pages 0.6–0.7, and less important pages 0.3–0.5. Priority is relative within your site only and does not influence how Google ranks pages against other sites. Most SEO tools recommend against over-specifying priority, as Google largely ignores it.
Set change frequency
Indicate how often each page is likely to change: always, hourly, daily, weekly, monthly, yearly, or never. The homepage and blog index change frequently (daily or weekly); static pages like About and Contact rarely change (monthly or yearly). Like priority, Google uses this as a hint rather than a firm instruction.
Set last modified dates
Add the lastmod date for each URL in ISO 8601 format (YYYY-MM-DD). This is the most useful field for search engines — it tells them when content was last updated so they can prioritise recrawling recently changed pages. Keep lastmod values accurate; artificially inflating them to trigger recrawling can result in Google ignoring the field entirely.
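Putting the fields from the steps above together, a single sitemap file combines a loc, lastmod, changefreq, and priority element per URL. The URLs and values below are illustrative placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://yoursite.com/blog/example-post</loc>
    <lastmod>2024-04-18</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.6</priority>
  </url>
</urlset>
```

Only loc is required by the sitemap protocol; lastmod, changefreq, and priority are optional hints.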
Download and submit
Download the generated sitemap.xml file and upload it to the root of your domain (yoursite.com/sitemap.xml). Submit the URL to Google Search Console under Sitemaps and to Bing Webmaster Tools. Also reference it from your robots.txt file by adding the line: Sitemap: https://yoursite.com/sitemap.xml
XML Sitemaps vs HTML Sitemaps
XML sitemaps are for search engines — they are machine-readable files designed to efficiently communicate your site structure to crawlers. HTML sitemaps are for human visitors — they are web pages listing links to all or key sections of a site to help users navigate. Both serve valid purposes but are entirely separate files. XML sitemaps have largely replaced HTML sitemaps in importance for SEO, but some sites still maintain both: the XML sitemap for search engine crawling and an HTML sitemap page for visitors who cannot find what they are looking for through normal navigation. Dynamic, regularly crawled sites benefit most from XML sitemaps; static sites with good internal linking benefit less.
Sitemap Best Practices for Dynamic Websites
Manual sitemap generation (as this tool helps with) works well for small to medium static sites. For dynamic websites where content changes frequently — blogs, e-commerce stores, news sites — automated sitemap generation is essential. Most CMS platforms offer sitemap plugins: Yoast SEO and Rank Math for WordPress, built-in sitemaps in Shopify, and sitemap packages for frameworks like Next.js and Gatsby. For custom applications, generate the sitemap dynamically on the server using your database of published content and expose it at /sitemap.xml. Update the sitemap whenever content is published, updated, or deleted. Consider using a sitemap index with separate sitemaps by content type (posts, products, pages) to make it easier to diagnose indexing issues by category.
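As a rough sketch of server-side generation, a minimal Python function that builds a sitemap string from a list of published-content records might look like this. The record fields and URLs are illustrative assumptions, not part of any particular CMS or framework:

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """Build a sitemap.xml string from (url, lastmod) records."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url, lastmod in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")  # escape &, <, > in URLs
        lines.append(f"    <lastmod>{lastmod.isoformat()}</lastmod>")  # YYYY-MM-DD
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

# Example: regenerate whenever content is published, updated, or deleted,
# and serve the result at /sitemap.xml.
pages = [
    ("https://yoursite.com/", date(2024, 5, 1)),
    ("https://yoursite.com/blog/example-post", date(2024, 4, 18)),
]
print(build_sitemap(pages))
```

In a real application the pages list would come from a database query over published content, and the route handler for /sitemap.xml would return this string with a Content-Type of application/xml.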
Frequently Asked Questions
How many URLs can a single sitemap contain?
A single XML sitemap file can contain a maximum of 50,000 URLs and must not exceed 50MB uncompressed. For larger sites, use a sitemap index file that references multiple individual sitemaps. Most SEO platforms and CMS plugins (Yoast SEO, Rank Math) generate sitemap indexes automatically for large sites.
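The 50,000-URL limit is typically handled by splitting URLs into chunks and referencing one child sitemap per chunk from an index file. A minimal sketch, assuming child sitemaps are named sitemap-1.xml, sitemap-2.xml, and so on (the names and base URL are illustrative):

```python
def build_sitemap_index(base_url, total_urls, chunk_size=50000):
    """Return sitemap index XML referencing one child sitemap per chunk of URLs."""
    n_chunks = -(-total_urls // chunk_size)  # ceiling division
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for i in range(1, n_chunks + 1):
        lines.append("  <sitemap>")
        lines.append(f"    <loc>{base_url}/sitemap-{i}.xml</loc>")
        lines.append("  </sitemap>")
    lines.append("</sitemapindex>")
    return "\n".join(lines)

# A site with 120,000 URLs needs three child sitemaps.
print(build_sitemap_index("https://yoursite.com", 120000))
```

The sitemap index itself is what you submit to Search Console; the crawler then fetches each child sitemap it lists.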
Does a small website need a sitemap?
For very small websites (under 20 pages) with good internal linking, a sitemap is not strictly necessary — Google can discover all pages by following links. However, a sitemap is still recommended for any website because it guarantees discoverability, makes it easier to see which URLs are indexed in Search Console, and lets you communicate page priorities and update frequencies. There is no downside to having one.
Which pages should I include in my sitemap?
Only include pages you want indexed. Exclude: pages with noindex meta tags, admin and login pages, thank-you pages after form submissions, paginated pages beyond page 2 (generally), pages with very thin content, and pages blocked in robots.txt. A lean, high-quality sitemap helps direct crawl budget to your best content.