
Reduce Vercel edge requests by disabling Link prefetch and adding ISR caching#19

Open
tylercb wants to merge 5 commits into main from claude/review-project-improvements-l8V4I

Conversation


@tylercb tylercb commented Mar 27, 2026

  • Disable Next.js Link prefetching on all internal links (RepoTable results,
    homepage examples, header). Each page with 100 results was generating 100+
    prefetch requests to Vercel's edge, massively inflating request counts.
  • Add ISR caching (revalidate=86400) to dynamic pages so rendered HTML is
    served from CDN cache for 24 hours instead of re-invoking functions.
  • Expand robots.txt to block AI crawlers and SEO bots (GPTBot, CCBot,
    ClaudeBot, SemrushBot, AhrefsBot, etc.) and restrict Googlebot/Bingbot
    to homepage only.

https://claude.ai/code/session_01LNGEccgWmmiFxDbfpc5zN4
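The prefetch change is a one-prop edit on each internal link. A minimal sketch, assuming the App Router (component and prop names besides `Link` itself are illustrative, not the actual code):

```tsx
import Link from "next/link";

// With prefetch={false}, the App Router no longer prefetches the route when the
// link scrolls into the viewport; route data is fetched only on navigation.
// A 100-row result table therefore stops fanning out 100 background requests.
// (Note: in the Pages Router, hover still triggers a prefetch even with
// prefetch={false}.)
export function RelatedRepoLink({ org, repo }: { org: string; repo: string }) {
  return (
    <Link href={`/${org}/${repo}`} prefetch={false}>
      {org}/{repo}
    </Link>
  );
}
```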

claude added 2 commits March 27, 2026 02:16
… caching


Copilot AI left a comment


Pull request overview

This PR aims to reduce Vercel edge request volume by preventing automatic client-side prefetch traffic, adding route-level ISR to cache rendered output, and discouraging crawler traffic via an expanded robots.txt.

Changes:

  • Disable Next.js <Link> prefetching on internal navigation links to prevent large prefetch fan-out from result lists.
  • Add export const revalidate = 86400 to dynamic routes to enable 24h ISR for rendered output.
  • Expand public/robots.txt to block various AI/SEO crawlers and limit major search engines’ crawl scope.

Reviewed changes

Copilot reviewed 6 out of 7 changed files in this pull request and generated 5 comments.

File summary:

  • public/robots.txt: Adds crawler blocks and new crawl restrictions; includes a sitemap reference and revised allow/disallow rules.
  • app/page.tsx: Disables prefetch on example internal links from the homepage.
  • app/components/RepoTable.tsx: Disables prefetch on per-row internal “related repos” links to avoid N+1 prefetch requests.
  • app/components/Header.tsx: Disables prefetch on the header home link.
  • app/[org]/[repo]/page.tsx: Adds 24h ISR (revalidate) for repo result pages.
  • app/[...githubPath]/page.tsx: Adds 24h ISR (revalidate) for redirecting GitHub path pages.


Comment on lines +68 to +69

```
Sitemap: https://gitrelated.com/sitemap.xml
```
Copilot AI Mar 27, 2026


Sitemap: https://gitrelated.com/sitemap.xml points to a sitemap that doesn't exist in this repo (no public/sitemap.xml and no sitemap route). This is likely to create 404s for crawlers; either add/generate the sitemap at that path or remove/update the Sitemap URL.

Suggested change

```diff
-Sitemap: https://gitrelated.com/sitemap.xml
```
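If the Sitemap line is kept instead, a Next.js metadata route is one way to satisfy it. A minimal sketch of a hypothetical `app/sitemap.ts` that lists only the homepage, consistent with the tightened robots.txt (the file and its entries are assumptions, not part of this PR):

```typescript
// app/sitemap.ts (hypothetical file): Next.js serves the returned entries at
// /sitemap.xml. Only the homepage is listed, matching the crawl policy above.
export default function sitemap() {
  return [
    {
      url: "https://gitrelated.com/",
      lastModified: new Date(),
      changeFrequency: "daily" as const,
      priority: 1,
    },
  ];
}
```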

Comment on lines 55 to +60

```
Allow: /
Disallow: /*/*

User-agent: Bingbot
Allow: /
Disallow: /*/*
```

Copilot AI Mar 27, 2026


The Googlebot/Bingbot rules currently allow any single-segment path (e.g. /anything) because Allow: / combined with Disallow: /*/* only blocks paths containing a second /. If the intent is truly “homepage only”, tighten these rules (e.g., allow / only and disallow everything else, while explicitly allowing required assets like /robots.txt, /favicon.ico, and /_next/).

Suggested change

```diff
-Allow: /
-Disallow: /*/*
-
-User-agent: Bingbot
-Allow: /
-Disallow: /*/*
+Allow: /$
+Allow: /robots.txt
+Allow: /favicon.ico
+Allow: /_next/
+Disallow: /
+
+User-agent: Bingbot
+Allow: /$
+Allow: /robots.txt
+Allow: /favicon.ico
+Allow: /_next/
+Disallow: /
```

Comment on lines +62 to +64

```
# Default: allow homepage and static assets, block all dynamic repo pages
User-agent: *
Allow: /$
```

Copilot AI Mar 27, 2026


This block says it “allow[s] homepage and static assets”, but with Disallow: / it only allows /, /robots.txt, and /favicon.ico (and the /$ pattern isn’t supported by all crawlers). Either update the comment/intent, or add explicit Allow rules for the static asset paths/extensions you want indexed (e.g. /_next/, images, etc.).

Suggested change

```diff
-# Default: allow homepage and static assets, block all dynamic repo pages
-User-agent: *
-Allow: /$
+# Default: block all content for other user agents (except robots.txt and favicon)
+User-agent: *
```

Comment on lines +4 to +5

```ts
// Cache rendered pages for 24 hours via ISR to reduce edge function invocations
export const revalidate = 86400;
```

Copilot AI Mar 27, 2026


Consider avoiding the magic number 86400 here (and in other routes) by defining a local ONE_DAY_SECONDS constant (or sharing an exported constant) to make the TTL self-documenting and reduce the chance of inconsistent values later.

Suggested change

```diff
-// Cache rendered pages for 24 hours via ISR to reduce edge function invocations
-export const revalidate = 86400;
+const ONE_DAY_SECONDS = 60 * 60 * 24;
+
+// Cache rendered pages for 24 hours via ISR to reduce edge function invocations
+export const revalidate = ONE_DAY_SECONDS;
```

Comment on lines +5 to +7

```ts
// Cache redirect responses for 24 hours to reduce edge function invocations
export const revalidate = 86400;
```

Copilot AI Mar 27, 2026


Consider avoiding the magic number 86400 by defining a ONE_DAY_SECONDS constant (or reusing a shared constant) so the caching TTL is self-documenting and stays consistent across routes.
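One caveat for both routes: the Next.js route segment config documentation states that option values must be statically analyzable (`revalidate = 600` is valid, but `revalidate = 60 * 10` is not), so a computed or imported `revalidate` may be silently ignored; verify the behavior against the deployed Next.js version before adopting the suggestion. A sketch of the shared-constant idea under that assumption (the module path is hypothetical):

```typescript
// lib/constants.ts (hypothetical path): single source of truth for the ISR TTL.
// Note: if your Next.js version rejects a non-literal `revalidate`, keep the
// literal 86400 in each route and use this constant only for other caching logic.
export const ONE_DAY_SECONDS = 86400; // 24 hours
```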

claude added 3 commits March 27, 2026 02:34
- Add @opennextjs/cloudflare adapter and wrangler for Cloudflare Workers deployment
- Create wrangler.toml with nodejs_compat and static asset config
- Create open-next.config.ts with Cloudflare config
- Update next.config.ts with initOpenNextCloudflareForDev() for local dev
- Add preview and deploy scripts to package.json
- Add .open-next and .wrangler to .gitignore

To deploy: npm run deploy
To preview locally: npm run preview

https://claude.ai/code/session_01LNGEccgWmmiFxDbfpc5zN4
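For orientation, a wrangler.toml along the lines the commit describes might look like this sketch. The worker name, paths, and compatibility date are assumptions based on @opennextjs/cloudflare conventions, not the actual file:

```toml
# Hypothetical wrangler.toml sketch for an OpenNext Cloudflare Workers deploy
name = "gitrelated"
main = ".open-next/worker.js"
compatibility_date = "2026-03-01"
compatibility_flags = ["nodejs_compat"]

[assets]
directory = ".open-next/assets"
binding = "ASSETS"
```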
The background bun install overwrote package.json, removing the Cloudflare
scripts. Restoring them and committing the updated bun.lockb.

https://claude.ai/code/session_01LNGEccgWmmiFxDbfpc5zN4