How to Create an SEO-Friendly Sitemap in Next.js
Search engines rely on clear signals to discover, crawl, and index your website correctly. One of the most important — yet often overlooked — SEO files is your sitemap. In modern Next.js projects, especially when using the App Router, generating a dynamic sitemap is both powerful and essential.
In this guide, we’ll walk through how to create a dynamic sitemap in Next.js using a sitemap.js file, include static and blog routes, and properly connect it with a robots.txt file for optimal SEO performance.
Why a Sitemap Matters for SEO
A sitemap is a structured file that tells search engines which pages exist on your website, how often they change, and how important they are. While search engines can discover pages through internal links, a sitemap ensures nothing important is missed.
- Improves crawl efficiency
- Helps search engines find new content faster
- Provides metadata like last updated dates
- Essential for dynamic routes such as blog posts
Next.js Sitemap with the App Router
Next.js allows you to generate a sitemap by exporting a function from a sitemap.js file inside the app directory. This function runs on the server and returns an array of URL objects that Next.js automatically converts into an XML sitemap.
This approach is clean, scalable, and ideal for blogs, portfolios, and content-driven websites.
Dynamic Sitemap Example (Static + Blog Routes)
```javascript
import { blogPosts } from "../../lib/data";

export default async function sitemap() {
  const baseUrl = "https://fergellmurphy.co.za/";

  // --- Static pages ---
  const staticPages = ["", "blog", "contact"].map((route) => ({
    url: `${baseUrl}${route}`,
    lastModified: new Date().toISOString(),
    changeFrequency: "weekly",
    priority: 1,
  }));

  // --- Dynamic blog routes ---
  const blogRoutes = blogPosts.map((post) => ({
    url: `${baseUrl}blog/${post.slug}`,
    lastModified: post.updatedAt || new Date().toISOString(),
    changeFrequency: "weekly",
    priority: 1,
  }));

  return [...staticPages, ...blogRoutes];
}
```
How This Code Works
This sitemap pulls data directly from your blog content and generates individual URLs for each post automatically. Whenever you add or update a blog article, your sitemap updates without manual changes.
- baseUrl defines your canonical website URL.
- staticPages includes routes like the homepage, blog index, and contact page.
- blogRoutes dynamically maps blog slugs into SEO crawlable URLs.
- lastModified helps search engines re-crawl updated content.
- priority signals relative importance on a 0.0–1.0 scale.
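The example above assigns every route a priority of 1, which gives search engines no ranking signal at all. A common refinement is to vary priority and changeFrequency by route type. The sketch below mocks `blogPosts` inline (the slugs and dates are hypothetical) so it runs standalone outside Next.js:

```javascript
// Hypothetical refinement: homepage outranks section pages, which
// outrank individual posts. `blogPosts` is mocked so this runs standalone.
const blogPosts = [
  { slug: "nextjs-sitemap", updatedAt: "2024-05-01T00:00:00.000Z" },
  { slug: "seo-basics" }, // no updatedAt -> falls back to "now"
];

const baseUrl = "https://fergellmurphy.co.za/";

const staticPages = ["", "blog", "contact"].map((route) => ({
  url: `${baseUrl}${route}`,
  lastModified: new Date().toISOString(),
  changeFrequency: route === "" ? "daily" : "weekly",
  priority: route === "" ? 1 : 0.8, // homepage gets top priority
}));

const blogRoutes = blogPosts.map((post) => ({
  url: `${baseUrl}blog/${post.slug}`,
  lastModified: post.updatedAt || new Date().toISOString(),
  changeFrequency: "weekly",
  priority: 0.7, // individual posts below index pages
}));

console.log(JSON.stringify([...staticPages, ...blogRoutes], null, 2));
```

Keep in mind that priority is only a hint: crawlers may weigh it lightly, but it is still cheap, accurate metadata to provide.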
Where Your Sitemap Is Generated
Once your site is deployed, Next.js automatically serves the generated sitemap at:
https://fergellmurphy.co.za/sitemap.xml
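For reference, each entry in the generated file follows the standard sitemap XML format. The slug and date below are illustrative; the exact formatting is controlled by Next.js:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://fergellmurphy.co.za/blog/your-post-slug</loc>
    <lastmod>2024-05-01T00:00:00.000Z</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1</priority>
  </url>
</urlset>
```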
You can submit this URL to Google Search Console to speed up indexing and monitor crawl performance.
Adding a robots.txt File
Your robots.txt file tells search engines how they may crawl your site and where to find your sitemap.
```
User-agent: *
Allow: /

Sitemap: https://fergellmurphy.co.za/sitemap.xml
```
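If you prefer to keep everything in code, the App Router also supports a robots.js file in the app directory, which Next.js serves as /robots.txt automatically. A minimal sketch mirroring the static file above:

```javascript
// app/robots.js — App Router alternative to a static robots.txt.
// Next.js serves the returned object as /robots.txt automatically.
export default function robots() {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
    },
    sitemap: "https://fergellmurphy.co.za/sitemap.xml",
  };
}
```

Either approach works; the code-based version is convenient when the sitemap URL or crawl rules depend on environment variables.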
Why robots.txt and Sitemap Work Together
While a sitemap lists your pages, the robots file acts as the entry point for crawlers. Together, they ensure search engines efficiently discover, crawl, and index your most important content.
Final Thoughts
Creating a dynamic sitemap in Next.js is one of the simplest ways to improve your website’s SEO foundation. When paired with a properly configured robots.txt file, this setup is scalable, future-proof, and fully AdSense-ready.