
feat: implement cacheForRequest() per-request factory cache#646

Open
JamesbbBriz wants to merge 1 commit into cloudflare:main from JamesbbBriz:feat/cache-for-request

Conversation

@JamesbbBriz

Summary

Implements cacheForRequest() from RFC #623 — a per-request factory cache that lazily initializes values on first call and returns the cached result for subsequent calls within the same request.

API

import { cacheForRequest } from "vinext/cache-for-request";

const getPrisma = cacheForRequest(() => {
  const pool = new Pool({ connectionString: env.HYPERDRIVE.connectionString });
  return new PrismaClient({ adapter: new PrismaPg(pool) });
});

// In any server code:
const prisma = getPrisma(); // first call creates, subsequent calls reuse

Key design decisions:

  • Factory function reference = cache key (no string keys, no cross-module collision)
  • Async factories cache the Promise itself (concurrent-safe: two await getDb() share one invocation)
  • Outside request scope: factory runs every time, no caching (safe for tests/build-time)
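The three decisions above can be sketched in one place. This is a hedged, standalone model of the mechanism, not the shim itself: a module-level variable stands in for the real ALS-backed unified request context, and `RequestContext` models only the `requestCache: WeakMap<Function, unknown>` field this PR adds (the actual shim resolves the scope via `isInsideUnifiedScope()`).

```typescript
// Standalone sketch: `currentContext` stands in for the ALS-backed lookup.
type RequestContext = { requestCache: WeakMap<Function, unknown> };

let currentContext: RequestContext | null = null; // stand-in for the ALS lookup

function cacheForRequest<T>(factory: () => T): () => T {
  return () => {
    const ctx = currentContext;
    if (ctx === null) {
      // Outside request scope: no caching, run the factory every time.
      return factory();
    }
    const cache = ctx.requestCache;
    if (cache.has(factory)) {
      return cache.get(factory) as T; // keyed by the factory reference itself
    }
    // An async factory's Promise is cached as-is, so concurrent callers
    // awaiting it share a single in-flight invocation.
    const value = factory();
    cache.set(factory, value);
    return value;
  };
}

// Demo: one simulated request scope, then a call outside any scope.
let calls = 0;
const getValue = cacheForRequest(() => ++calls);
currentContext = { requestCache: new WeakMap() };
getValue(); // factory runs: calls === 1
getValue(); // cache hit: factory not re-run
currentContext = null;
getValue(); // no scope: factory runs again, calls === 2
console.log(calls); // 2
```

Because the key is the function object, two modules wrapping different factories can never collide, and a fresh WeakMap per request context means nothing survives across requests.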

Changes

File · Change

  • shims/cache-for-request.ts — New: cacheForRequest() implementation (17 lines of logic)
  • shims/unified-request-context.ts — Add requestCache: WeakMap<Function, unknown> field + shallow-clone comment sync
  • package.json — Add "./cache-for-request" export

Blast radius

Zero. New file + one new field with new WeakMap() default (~0.1μs per request). No existing behavior changes.

Test plan

Tested with Prisma v7 + Hyperdrive on Workers. Happy to add unit tests — let me know the preferred test file location.

Addresses #623 · Relates to #537

Implements the cacheForRequest API discussed in cloudflare#623.
Factory function reference is the cache key — no string keys, no
collision risk between modules. Async factories cache the Promise
itself, so concurrent calls share one invocation. Outside request
scope, the factory runs every time with no caching.

@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f2f1d15111


Comment on lines +55 to +57:

if (!isInsideUnifiedScope()) {
  // Outside request scope: no caching, run factory every time.
  return factory();

P2: Cache in execution-context request scopes

cacheForRequest() disables caching whenever isInsideUnifiedScope() is false, but Pages Router middleware runs in a request scope without unified ALS (runMiddleware() only wraps _runWithExecutionContext, and prod server currently calls runMiddleware(webRequest, undefined)). In that path, repeated calls within a single request always re-run the factory, so this API does not provide the advertised per-request behavior for middleware/server code on Pages Router production flows.


@JamesbbBriz (Author)

Re: middleware scope — this is intentional. Caching without a proper request scope would risk leaking state across requests, which is worse than re-running the factory. Pages Router middleware is the only path without unified scope; wrapping it would be a separate middleware pipeline change. The fallback (factory runs every time) is safe and predictable.
