[GH-ISSUE #204] Systematic audit of Next.js test suite (file-by-file) #51

Open
opened 2026-05-06 12:36:50 +02:00 by BreizhHardware · 2 comments

Originally created by @southpolesteve on GitHub (Feb 28, 2026).
Original GitHub issue: https://github.com/cloudflare/vinext/issues/204

Problem

Our Next.js compatibility test tracking (tests/nextjs-compat/TRACKING.md) was built using a feature-first methodology: identify features vinext implements, then find relevant Next.js tests. This missed edge cases and error handling tests that don't map neatly to a "feature."

Example: test/e2e/app-dir/proxy-missing-export/ tests that Next.js throws an error when a proxy/middleware file doesn't export the expected function. Our middleware implementation silently failed open instead, letting requests through unprotected. This was never caught because middleware was already "covered" by other tests (ON-6, ON-11), and the gap analysis never opened this specific test directory.
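For reference, a minimal TypeScript sketch of the fail-closed behavior that test expects. This is not vinext's actual code; the function name and error message are illustrative:

```typescript
// Illustrative sketch: resolving a middleware handler from a loaded module.
// A middleware file conventionally exports `middleware` (or a default export);
// if neither is a function, we should fail closed with a clear error rather
// than silently letting requests through.
function resolveMiddlewareHandler(mod: Record<string, unknown>): Function {
  const handler = mod.middleware ?? mod.default;
  if (typeof handler !== "function") {
    // Fail closed: surface a configuration error instead of failing open.
    throw new Error("Middleware file must export a `middleware` function");
  }
  return handler;
}
```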

The root cause is that TRACKING.md was built by asking "what features do we have, and do we have tests?" instead of "what does Next.js test, and do we match?"

Fixed in #203, but we need a systematic audit to find other gaps like this.

Proposal

Do a file-by-file walk through every test directory in the Next.js repo's test suite. For each directory:

  1. Read what the test covers
  2. Determine if it's relevant to vinext (skip build-only, Turbopack-specific, Vercel-deploy-specific)
  3. If relevant, check whether we have equivalent coverage
  4. If not, either port the test or document why it's not applicable
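The relevance triage in step 2 could be mechanized for a first pass over directory names. A minimal sketch; the skip patterns below are assumptions for illustration, not an official taxonomy:

```typescript
// Hypothetical first-pass triage by directory name. Anything matching a skip
// pattern is excluded up front; everything else gets a manual audit.
type Triage = "skip" | "audit";

const SKIP_PATTERNS: RegExp[] = [/turbopack/, /^build-/, /vercel-deploy/];

function triageDir(name: string): Triage {
  return SKIP_PATTERNS.some((p) => p.test(name)) ? "skip" : "audit";
}

console.log(triageDir("turbopack-loaders")); // "skip"
console.log(triageDir("proxy-missing-export")); // "audit"
```

A name-based pass only narrows the list; step 1 (reading what each test covers) still decides the final call.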

Directories to audit

  • test/e2e/app-dir/ (365+ directories, partially covered by TRACKING.md)
  • test/e2e/ top-level (middleware, pages router, config, etc.)
  • test/unit/ (pure function tests for routing, matching, etc.)
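Seeding the tracking document could start from a plain directory listing of a local Next.js checkout. A sketch; the root path is whatever the auditor checked out:

```typescript
import { readdirSync } from "node:fs";

// List immediate subdirectories of a test root (e.g. test/e2e/app-dir in a
// local Next.js checkout), sorted, as seed rows for the tracking document.
function listTestDirs(root: string): string[] {
  return readdirSync(root, { withFileTypes: true })
    .filter((entry) => entry.isDirectory())
    .map((entry) => entry.name)
    .sort();
}
```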

Methodology

For each directory, record in a tracking document:

  • Directory name and what it tests
  • Relevance to vinext (yes/no/partial)
  • Current vinext coverage (covered/missing/partial)
  • Action needed (port tests/skip/N/A)
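One possible machine-readable shape for these records, taken from the bullet list above; field names are illustrative, not a committed schema:

```typescript
// One row of the audit tracking document, as a typed record.
type Relevance = "yes" | "no" | "partial";
type Coverage = "covered" | "missing" | "partial";
type Action = "port" | "skip" | "n/a";

interface AuditEntry {
  dir: string; // e.g. "test/e2e/app-dir/proxy-missing-export"
  summary: string; // what the directory tests
  relevance: Relevance;
  coverage: Coverage;
  action: Action;
}

const example: AuditEntry = {
  dir: "test/e2e/app-dir/proxy-missing-export",
  summary: "errors when a middleware/proxy file lacks the expected export",
  relevance: "yes",
  coverage: "missing",
  action: "port",
};
```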

Priority ordering

Focus first on:

  1. Error handling and validation tests (like proxy-missing-export) since these are the ones most likely to be missed by feature-first analysis and most dangerous when missing
  2. Edge cases for already-implemented features
  3. New feature areas not yet covered

Context

  • Current TRACKING.md: tests/nextjs-compat/TRACKING.md
  • The proxy-missing-export fix: #203
  • Next.js test suite: https://github.com/vercel/next.js/tree/canary/test/e2e/app-dir

@southpolesteve commented on GitHub (Feb 28, 2026):

This is a job for ralph wiggum


@Divkix commented on GitHub (Mar 11, 2026):

Related to #454. While this issue audits test coverage, #454 is building a machine-readable API manifest to track which exports vinext implements. Both aim to find gaps but from different angles - this from tests, that from the API surface. Not blocking.
