Sitemap & Robots Generation
Generate sitemap.xml and robots.txt automatically from your route config — and auto-build breadcrumbs from any URL path.
sitemap.xml Generation
Pass your route list and a base URL — the library outputs a valid sitemap ready to serve or write to disk. Wildcard exclude patterns keep private routes out of the generated file.
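The wildcard matching behind exclude can be pictured as a small glob-to-regex helper. The sketch below is illustrative only — the function name isExcluded and the exact matching semantics are assumptions, not the library's internals:

```typescript
// Illustrative only: one plausible way "/dashboard/*"-style
// exclude patterns can be matched against route paths.
function isExcluded(path: string, patterns: string[]): boolean {
  return patterns.some((pattern) => {
    // Escape regex metacharacters, then turn "*" into ".*".
    const source = pattern
      .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
      .replace(/\*/g, ".*");
    return new RegExp(`^${source}$`).test(path);
  });
}

// "/dashboard/*" matches nested dashboard routes but not "/about".
console.log(isExcluded("/dashboard/settings", ["/dashboard/*"])); // true
console.log(isExcluded("/about", ["/dashboard/*", "/admin/*"]));  // false
```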
import { generateSitemap } from "react-ssr-seo-toolkit/sitemap";
export const sitemap = generateSitemap({
baseUrl: "https://trustix.uk",
routes: [
{ path: "/", priority: 1.0, changefreq: "daily" },
{ path: "/tickets", priority: 0.9, changefreq: "hourly" },
{ path: "/tickets/liverpool-vs-arsenal", priority: 0.8 },
{ path: "/about", priority: 0.5, changefreq: "monthly" },
"/blog",
"/contact",
],
exclude: ["/dashboard/*", "/admin/*"],
defaultChangefreq: "weekly",
defaultPriority: 0.6,
});

// app/sitemap.xml/route.ts
import { sitemap } from "@/lib/sitemap";
export function GET() {
return new Response(sitemap, {
headers: { "Content-Type": "application/xml; charset=utf-8" },
});
}

// app/routes/sitemap[.xml].tsx
import { sitemap } from "~/lib/sitemap";
import type { LoaderFunction } from "react-router";
export const loader: LoaderFunction = () =>
new Response(sitemap, {
headers: { "Content-Type": "application/xml; charset=utf-8" },
});

// server.ts
import { sitemap } from "./lib/sitemap";
app.get("/sitemap.xml", (_req, res) =>
res.type("application/xml").send(sitemap)
);
// Or build-time: write to disk
// import { writeFileSync } from "fs";
// writeFileSync("public/sitemap.xml", sitemap);

Output: sitemap.xml

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://trustix.uk</loc>
<lastmod>2026-05-08</lastmod>
<changefreq>daily</changefreq>
<priority>1.0</priority>
</url>
<url>
<loc>https://trustix.uk/tickets</loc>
<lastmod>2026-05-08</lastmod>
<changefreq>hourly</changefreq>
<priority>0.9</priority>
</url>
<url>
<loc>https://trustix.uk/tickets/liverpool-vs-arsenal</loc>
<lastmod>2026-05-08</lastmod>
<changefreq>weekly</changefreq>
<priority>0.8</priority>
</url>
<url>
<loc>https://trustix.uk/about</loc>
<lastmod>2026-05-08</lastmod>
<changefreq>monthly</changefreq>
<priority>0.5</priority>
</url>
<url>
<loc>https://trustix.uk/blog</loc>
<lastmod>2026-05-08</lastmod>
<changefreq>weekly</changefreq>
<priority>0.6</priority>
</url>
<url>
<loc>https://trustix.uk/contact</loc>
<lastmod>2026-05-08</lastmod>
<changefreq>weekly</changefreq>
<priority>0.6</priority>
</url>
</urlset>

robots.txt Generation
Define per-agent rules, disallow private paths, and include your sitemap URL — all from a typed config object.
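The mapping from a typed rule to robots.txt lines is mechanical. The sketch below conveys the idea only — the helper name serializeRule and the config shape are assumptions for illustration, not the library's internals:

```typescript
// Illustrative only: serialize one per-agent rule block to robots.txt lines.
interface RobotsRule {
  userAgent: string;
  allow?: string | string[];
  disallow?: string | string[];
  crawlDelay?: number;
}

function serializeRule(rule: RobotsRule): string {
  // Accept either a single path or an array of paths.
  const toArray = (v?: string | string[]) =>
    v === undefined ? [] : Array.isArray(v) ? v : [v];
  const lines = [`User-agent: ${rule.userAgent}`];
  for (const path of toArray(rule.allow)) lines.push(`Allow: ${path}`);
  for (const path of toArray(rule.disallow)) lines.push(`Disallow: ${path}`);
  if (rule.crawlDelay !== undefined) lines.push(`Crawl-delay: ${rule.crawlDelay}`);
  return lines.join("\n");
}

console.log(serializeRule({ userAgent: "Googlebot", allow: "/", crawlDelay: 2 }));
// User-agent: Googlebot
// Allow: /
// Crawl-delay: 2
```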
import { generateRobots } from "react-ssr-seo-toolkit/sitemap";
export const robots = generateRobots({
rules: [
{
userAgent: "*",
allow: "/",
disallow: ["/dashboard", "/admin", "/login"],
},
{
userAgent: "Googlebot",
allow: "/",
crawlDelay: 2,
},
],
sitemap: "https://trustix.uk/sitemap.xml",
});

// app/robots.txt/route.ts
import { robots } from "@/lib/robots";
export function GET() {
return new Response(robots, {
headers: { "Content-Type": "text/plain; charset=utf-8" },
});
}

// app/routes/robots[.txt].tsx
import { robots } from "~/lib/robots";
import type { LoaderFunction } from "react-router";
export const loader: LoaderFunction = () =>
new Response(robots, {
headers: { "Content-Type": "text/plain; charset=utf-8" },
});

// server.ts
import { robots } from "./lib/robots";
app.get("/robots.txt", (_req, res) =>
res.type("text/plain").send(robots)
);

Output: robots.txt

User-agent: *
Allow: /
Disallow: /dashboard
Disallow: /admin
Disallow: /login

User-agent: Googlebot
Allow: /
Crawl-delay: 2

Sitemap: https://trustix.uk/sitemap.xml
autoBreadcrumb — URL to BreadcrumbList
Pass any URL path and get a typed BreadcrumbItem[] back — hyphens become spaces, each segment is capitalized. Feed the result straight into createBreadcrumbSchema().
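The segment-to-label rule (split on hyphens, capitalize each word) is easy to reproduce. A minimal sketch, assuming no custom label overrides — the function name segmentToLabel is illustrative, not part of the library's API:

```typescript
// Illustrative only: turn a URL segment into a human-readable label.
function segmentToLabel(segment: string): string {
  return segment
    .split("-")
    .map((word) => word.charAt(0).toUpperCase() + word.slice(1))
    .join(" ");
}

console.log(segmentToLabel("liverpool-vs-arsenal")); // "Liverpool Vs Arsenal"
```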
Input path: /ticket/liverpool-vs-arsenal
import { autoBreadcrumb } from "react-ssr-seo-toolkit/sitemap";
import { createBreadcrumbSchema } from "react-ssr-seo-toolkit";
const items = autoBreadcrumb("/ticket/liverpool-vs-arsenal", {
baseUrl: "https://trustix.uk",
labels: { "/ticket": "Tickets" }, // optional label overrides
});
// Result:
// [
// { name: "Home", url: "https://trustix.uk/" },
// { name: "Tickets", url: "https://trustix.uk/ticket" },
// { name: "Liverpool Vs Arsenal", url: "https://trustix.uk/ticket/liverpool-vs-arsenal" },
// ]
// Feed directly into JSON-LD schema:
const schema = createBreadcrumbSchema(items);

{
"@context": "https://schema.org",
"@type": "BreadcrumbList",
"itemListElement": [
{
"@type": "ListItem",
"position": 1,
"name": "Home",
"item": "https://trustix.uk/"
},
{
"@type": "ListItem",
"position": 2,
"name": "Tickets",
"item": "https://trustix.uk/ticket"
},
{
"@type": "ListItem",
"position": 3,
"name": "Liverpool Vs Arsenal",
"item": "https://trustix.uk/ticket/liverpool-vs-arsenal"
}
]
}

Features Demonstrated
generateSitemap
Accepts string[] or SitemapRoute[]. Supports per-route priority, changefreq, and lastmod. Wildcard exclude patterns.
generateRobots
Per-agent rules with allow/disallow arrays. Crawl-delay support. Multiple sitemap URLs.
autoBreadcrumb
Converts URL segments to readable labels. Custom label overrides per path. Custom format function for advanced cases.
Zero Config
All functions work with sensible defaults. No setup, no schema registration — just call and use.