
Conversation

@H01001000 H01001000 commented Nov 19, 2025

This PR adds experimental support for Next 16 cache component cache handlers.

To do so:

  • The mergeHandler creation is separated out into a reusable function
  • A CacheHandlersValue type is added to represent/encapsulate the cache component's CacheEntry type inside the CacheHandlerValue structure
  • Generic type parameters are added to many functions (until TypeScript stops complaining), allowing them to use the new CacheHandlersValue type

For now, this means no interface change for existing cache-handler users, while users of the new cache-handlers need to cast their handler to the CacheHandlersValue type.

I haven't tested it in depth yet, but it works.

Example usage

// cache-handlers.ts

import type { CacheHandlersValue } from "@fortedigital/nextjs-cache-handler/cache-handler.types";
import { createCacheHandlers } from "@fortedigital/nextjs-cache-handler/cache-handlers";
import createLruHandler from "@fortedigital/nextjs-cache-handler/local-lru";
import createRedisHandler from "@fortedigital/nextjs-cache-handler/redis-strings";
import { createClient } from "redis";

const settings = {
  // ...
};

const redisClient = createClient(settings);

try {
  console.info("Connecting Redis client...");
  await redisClient.connect();
  console.info("Redis client connected.");
} catch (error) {
  console.warn("Failed to connect Redis client:", error);
  await redisClient
    .disconnect()
    .catch(() =>
      console.warn(
        "Failed to disconnect the Redis client after failing to connect.",
      ),
    );
}

const lruCache = createLruHandler();
const redisCacheHandler = createRedisHandler<CacheHandlersValue>({
  // @ts-ignore
  client: redisClient,
  keyPrefix: "nextjs:",
});

export default await createCacheHandlers({
  handlers: [redisCacheHandler],
});

@H01001000 H01001000 changed the title feat: add temp cache-handlers support feat: add experimental cache-handlers support Nov 19, 2025
const handlersList: Handler<CacheHandlersValue>[] = config.handlers.filter(
  (handler) => !!handler,
);
const memoryCache = createMergedHandler(handlersList);
Collaborator

@AyronK AyronK Nov 19, 2025


Why memoryCache, when this is merged from multiple handlers, possibly Redis as well?

I think this implementation is quite naive in the sense that it only "makes it work" without addressing the actual implementation of the new interface.

Author


Oops, I forgot to rename that. I was actually using it with Redis and LRU.

Collaborator

AyronK commented Nov 19, 2025

Thank you for your contribution @H01001000. However, I'll keep it on hold as a reference for now. I need a better overview of what needs to be done, and I don't think wrapping handlers with a decorator that does this:

    async refreshTags() {
      // Nothing to do for an in-memory cache handler.
    },

    async getExpiration() {
      return Infinity;
    },

is the way to go.

Nevertheless, it's quite a bit of exploration work already done, thank you for that!
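To make the objection concrete, here is a minimal sketch of what a non-stub implementation might look like, assuming the Next 16 interface expects getExpiration to return a timestamp to compare cache entries against. Every name below is hypothetical: the Map stands in for a store shared across instances (e.g. Redis), and this is an illustration of the idea, not the library's actual API.

```typescript
// Hypothetical tag-expiration store shared across Next.js instances.
// A Map stands in for something like Redis; tag -> time it was expired (ms).
const sharedTagStore = new Map<string, number>();

// Expiring tags records *when* they were expired, instead of being a no-op.
async function expireTags(...tags: string[]): Promise<void> {
  const now = Date.now();
  for (const tag of tags) {
    sharedTagStore.set(tag, now);
  }
}

// Instead of returning Infinity, report the most recent expiration
// timestamp among the given tags (0 if none was ever expired), so the
// caller can compare it against the entry's own timestamp.
async function getExpiration(...tags: string[]): Promise<number> {
  let latest = 0;
  for (const tag of tags) {
    const expiredAt = sharedTagStore.get(tag);
    if (expiredAt !== undefined && expiredAt > latest) {
      latest = expiredAt;
    }
  }
  return latest;
}
```

With the no-op stubs, an entry set before expireTags was called would never be seen as stale; with a shared timestamp store, any instance can detect it.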

Author

H01001000 commented Nov 19, 2025

Yeah, totally.

Regarding refreshTags and getExpiration: I think when Vercel designed cache handlers, they had two tag-invalidation patterns in mind.

  1. (Same as the existing cache handler.) Tags are stored and handled alongside the cache entries, e.g. a Redis get returns value + tags, so most tag invalidation happens when reading from the cache.
  2. Tags are stored and handled separately from the cache, e.g. you get the value from Redis, then update a local tag registry from a "tag service" (hence refreshTags), which means each Next.js instance invalidates tags on its own.

A bit wordy, but I hope that's clear; this is just my interpretation of their docs.
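The second pattern can be sketched as follows. This is only my illustration of the flow described above, not code from the library or from Next.js: the "tag service" is mocked with a Map, and all names are made up.

```typescript
// Hypothetical "tag service" shared by all instances (mocked with a Map):
// tag -> time it was expired (ms).
const tagService = new Map<string, number>();

// Each Next.js instance keeps its own local copy of the tag state.
const localTagRegistry = new Map<string, number>();

// Pattern 2: refreshTags pulls the latest tag state from the tag service
// into this instance, so later reads can consult the local registry
// without another round trip.
async function refreshTags(): Promise<void> {
  for (const [tag, expiredAt] of tagService) {
    localTagRegistry.set(tag, expiredAt);
  }
}

// A cached entry is stale if any of its tags was expired after the entry
// was written, according to this instance's local registry.
function isStale(setAt: number, tags: string[]): boolean {
  return tags.some((tag) => (localTagRegistry.get(tag) ?? 0) > setAt);
}
```

The trade-off versus pattern 1 is that staleness checks are local and cheap, but each instance only sees invalidations after its next refreshTags call.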
