next.js
2458d4ca - Buffer prefetch response before passing to Flight client

Prefetch responses include metadata (in the Flight stream sense, not HTML document metadata) that describes properties of the overall response, such as the stale time and the set of params that were accessed during rendering. Conceptually these are like HTTP trailers (late headers): information that is only known once the response is complete. Since we can't rely on actual HTTP trailers being supported everywhere, we encode this metadata in the body of the Flight response.

The mechanism works by including an unresolved thenable in the Flight payload, then resolving it just before closing the stream. On the client, after the stream is fully received, we unwrap the thenable synchronously. This synchronous unwrap relies on the assumption that the server resolved the thenable before closing the stream.

The server already buffers prefetch responses before sending them, so the resolved thenable data is always present in the response. However, HTTP chunking in the browser layer can introduce task boundaries while the response is processed, which can prevent Flight from decoding the full payload synchronously. The existing code includes fallback behavior for this case (e.g. treating the vary params as unknown), so this change does not fix a semantic issue; it strengthens the guarantee so that the fallback path is never reached.

To do this, we buffer the full response on the client and concatenate it into a single chunk before passing it to Flight. A single chunk is necessary because Flight's processBinaryChunk processes all rows synchronously within one call. Multiple chunks would not be sufficient even if they were pre-enqueued: the `await` continuation from createFromReadableStream can interleave between chunks, causing promise value rows to be processed after the root model initializes, which leaves thenables in a pending state.
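The client-side buffering step can be sketched roughly as follows. This is an illustrative sketch, not the actual Next.js code; the helper name `bufferStream` is hypothetical, and the only assumption is the standard `ReadableStream` API:

```typescript
// Buffer an entire ReadableStream into a single Uint8Array chunk.
// Concatenating into one chunk lets Flight's processBinaryChunk see
// every row in a single synchronous pass, so the metadata thenable
// resolved by the server is already fulfilled when the root model
// initializes.
async function bufferStream(
  stream: ReadableStream<Uint8Array>
): Promise<ReadableStream<Uint8Array>> {
  const reader = stream.getReader();
  const chunks: Uint8Array[] = [];
  let totalLength = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    totalLength += value.byteLength;
  }
  // Concatenate into one contiguous buffer.
  const buffer = new Uint8Array(totalLength);
  let offset = 0;
  for (const chunk of chunks) {
    buffer.set(chunk, offset);
    offset += chunk.byteLength;
  }
  // Re-expose the payload as a stream that enqueues exactly one chunk.
  return new ReadableStream<Uint8Array>({
    start(controller) {
      controller.enqueue(buffer);
      controller.close();
    },
  });
}
```

The resulting stream would then be handed to the Flight client in place of the raw network stream.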
Since the server already buffers these responses and they complete during a prefetch (not during a navigation), this is not a performance concern. Full (dynamic) prefetches are not affected by this change. Those are streaming responses: even though they are cached, they are a special case where dynamic data is treated as if it were cached, and they don't need to be buffered on either the server or the client the way normal cached responses are.
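The synchronous unwrap and its fallback described above can be sketched like this. This is a simplified illustration, not the Flight client's actual internals; the status-tagged thenable shape mirrors React's internal convention, and `unwrapMetadata` is a hypothetical helper name:

```typescript
// A thenable the Flight client tracks with a status tag, following
// React's internal convention for inspecting promise state synchronously.
type FulfilledThenable<T> = PromiseLike<T> & { status: 'fulfilled'; value: T };
type PendingThenable<T> = PromiseLike<T> & { status: 'pending' };
type TrackedThenable<T> = FulfilledThenable<T> | PendingThenable<T>;

function unwrapMetadata<T>(thenable: TrackedThenable<T>, fallback: T): T {
  // If the server resolved the thenable before closing the stream and the
  // client decoded the whole payload in one synchronous pass, the status
  // is already 'fulfilled' here.
  if (thenable.status === 'fulfilled') {
    return thenable.value;
  }
  // Fallback path: treat the metadata (e.g. the vary params) as unknown.
  // Buffering the response into a single chunk ensures this branch is
  // never reached in practice.
  return fallback;
}
```

With single-chunk buffering in place, the `pending` branch becomes dead code rather than a correctness escape hatch.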