@fergellmurphy01

How to Create an SEO-Friendly Sitemap in Next.js

Search engines rely on clear signals to discover, crawl, and index your website correctly. One of the most important — yet often overlooked — SEO files is your sitemap. In modern Next.js projects, especially when using the App Router, generating a dynamic sitemap is both powerful and essential.

In this guide, we’ll walk through how to create a dynamic sitemap in Next.js using a sitemap.js file, include static and blog routes, and properly connect it with a robots.txt file for optimal SEO performance.

Why a Sitemap Matters for SEO

A sitemap is a structured file that tells search engines which pages exist on your website, how often they change, and how important they are. While search engines can discover pages through internal links, a sitemap ensures nothing important is missed.

  1. Improves crawl efficiency
  2. Helps search engines find new content faster
  3. Provides metadata like last updated dates
  4. Essential for dynamic routes such as blog posts

Next.js Sitemap with the App Router

Next.js allows you to generate a sitemap by exporting a function from a sitemap.js file inside the app directory. This function runs on the server and returns an array of URL objects that Next.js automatically converts into an XML sitemap.

This approach is clean, scalable, and ideal for blogs, portfolios, and content-driven websites.

Dynamic Sitemap Example (Static + Blog Routes)

```js
import { blogPosts } from "../../lib/data";

export default async function sitemap() {
  const baseUrl = "https://fergellmurphy.co.za/";

  // --- Static pages ---
  const staticPages = ["", "blog", "contact"].map((route) => ({
    url: `${baseUrl}${route}`,
    lastModified: new Date().toISOString(),
    changeFrequency: "weekly",
    priority: 1,
  }));

  // --- Dynamic blog routes ---
  const blogRoutes = blogPosts.map((post) => ({
    url: `${baseUrl}blog/${post.slug}`,
    lastModified: post.updatedAt || new Date().toISOString(),
    changeFrequency: "weekly",
    priority: 1,
  }));

  return [...staticPages, ...blogRoutes];
}
```

How This Code Works

This sitemap pulls data directly from your blog content and generates individual URLs for each post automatically. Whenever you add or update a blog article, your sitemap updates without manual changes.

  1. baseUrl defines your canonical website URL (note the trailing slash, which the route strings rely on).
  2. staticPages includes routes like the homepage, blog index, and contact page.
  3. blogRoutes dynamically maps blog slugs into crawlable URLs.
  4. lastModified helps search engines re-crawl updated content.
  5. changeFrequency hints at how often each page is likely to change.
  6. priority signals relative importance on a 0.0 to 1.0 scale.
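The sitemap code assumes each entry in blogPosts carries a slug and, optionally, an updatedAt timestamp. A minimal sketch of that data shape (the field names and values here are hypothetical, just mirroring what the mapping above expects):

```javascript
// Hypothetical shape of blogPosts in lib/data — the sitemap above
// only needs a `slug` and an optional `updatedAt` per post.
const blogPosts = [
  { slug: "nextjs-sitemap-guide", updatedAt: "2024-05-01T00:00:00.000Z" },
  { slug: "react-server-components" }, // no updatedAt → falls back to "now"
];

const baseUrl = "https://fergellmurphy.co.za/";

// Same mapping as in sitemap.js:
const blogRoutes = blogPosts.map((post) => ({
  url: `${baseUrl}blog/${post.slug}`,
  lastModified: post.updatedAt || new Date().toISOString(),
}));

console.log(blogRoutes[0].url);
// → "https://fergellmurphy.co.za/blog/nextjs-sitemap-guide"
```

If a post is missing updatedAt, the fallback to the current timestamp keeps the sitemap valid, at the cost of telling crawlers the page changed when it may not have.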

Where Your Sitemap Is Generated

Once deployed, Next.js automatically exposes this sitemap at:

https://fergellmurphy.co.za/sitemap.xml

You can submit this URL to Google Search Console to speed up indexing and monitor crawl performance.
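For the routes above, the XML that Next.js serves looks roughly like this (one url entry per route; the slug and lastmod values here are illustrative, since the real ones depend on your content and build time):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://fergellmurphy.co.za/</loc>
    <lastmod>2024-05-01T00:00:00.000Z</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1</priority>
  </url>
  <url>
    <loc>https://fergellmurphy.co.za/blog/my-first-post</loc>
    <lastmod>2024-05-01T00:00:00.000Z</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1</priority>
  </url>
</urlset>
```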

Adding a robots.txt File

Your robots.txt file tells search engines how they may crawl your site and where to find your sitemap. In a Next.js project you can serve it as a static file (for example from the public directory, or directly in the app directory when using the App Router):

```txt
User-agent: *
Allow: /

Sitemap: https://fergellmurphy.co.za/sitemap.xml
```
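As an alternative to a static file, the App Router can also generate robots.txt from code, the same way it generates the sitemap. A minimal sketch of an app/robots.js mirroring the static file above:

```javascript
// app/robots.js — Next.js serves the returned object as /robots.txt
export default function robots() {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
    },
    sitemap: "https://fergellmurphy.co.za/sitemap.xml",
  };
}
```

The code-based version is handy if the base URL differs per environment (e.g. staging vs. production), since you can read it from an environment variable instead of hardcoding it.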

Why robots.txt and Sitemap Work Together

While a sitemap lists your pages, the robots file acts as the entry point for crawlers. Together, they ensure search engines efficiently discover, crawl, and index your most important content.

Final Thoughts

Creating a dynamic sitemap in Next.js is one of the simplest ways to improve your website’s SEO foundation. When paired with a properly configured robots.txt file, this setup is scalable, future-proof, and fully AdSense-ready.