JSON Streaming: How to Display Results 10x Earlier


How I dramatically reduced perceived latency from 971ms to 79ms on my domain search tool by implementing streaming JSON responses.

by Jonathan Beurel

When I was building a domain availability search tool, I quickly encountered a fundamental problem: checking 40+ domain extensions in parallel was surprisingly slow.

I decided to dig deeper to understand where this slowness was coming from.

I'll show you how JSON Streaming solved this problem and reduced the time to first result from 971ms to 79ms — a 10x improvement in perceived performance.

The Problem: Parallel Requests, Sequential Display

Here's what a typical domain availability check looks like:

// Traditional approach: wait for all results
const results = await Promise.all(domains.map((domain) => checkAvailability(domain)));
 
// Only now can we return results
return NextResponse.json({ results });

I added server-side logs to see why the request was taking so long. Was the availability check equally fast across all extensions? It turned out response times varied enormously:

[Domain Check] Starting check for: "startup"
   Extensions: 40
   ⏱️  startup.io    79ms Taken
   ⏱️  startup.de    92ms Taken
   ⏱️  startup.jp    91ms Taken
   ⏱️  startup.us    91ms Taken
   ⏱️  startup.au    91ms Available
   ⏱️  startup.co   119ms Taken
   ...
   ⏱️  startup.company   780ms Taken
   ⏱️  startup.news   780ms Available
   ⏱️  startup.group   784ms Taken
   ⏱️  startup.media   785ms Taken
   ⏱️  startup.agency   784ms Taken
   ⏱️  startup.today   783ms Taken
   ⏱️  startup.studio   790ms Taken
   ⏱️  startup.world   789ms Taken
   ⏱️  startup.in   804ms Taken
   ⏱️  startup.br   971ms Available

The user waits nearly a full second for the first result, even though most results were available almost immediately.

Why make the user wait? Couldn't we give them value much earlier?

The Solution: JSON Streaming Responses

Instead of batching results into a single response, we stream each result as it becomes available using NDJSON (Newline Delimited JSON):

{"domain":"awesome.us","available":true,"tld":"us"}
{"domain":"awesome.de","available":true,"tld":"de"}
{"domain":"awesome.au","available":true,"tld":"au"}
...

Each line is a complete JSON object, and the browser can parse and display them progressively.
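To make the format concrete, here's a minimal round trip (a sketch; `DomainResult`, `toNDJSON`, and `fromNDJSON` are hypothetical names mirroring the payload above):

```typescript
// Hypothetical shape of one streamed result.
interface DomainResult {
  domain: string;
  available: boolean;
  tld: string;
}

// Serialize each result as one newline-terminated line.
function toNDJSON(results: DomainResult[]): string {
  return results.map((r) => JSON.stringify(r) + '\n').join('');
}

// Each line parses independently, without the rest of the stream.
function fromNDJSON(text: string): DomainResult[] {
  return text
    .split('\n')
    .filter((line) => line.trim() !== '')
    .map((line) => JSON.parse(line) as DomainResult);
}
```

Because every line is self-delimiting, the client never needs the full response before it can start parsing.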

Benefits Over Traditional JSON

Performance:

  • First result visible in ~79ms instead of 971ms
  • No single slow request blocks the entire UI
  • Single HTTP connection with progressive data transfer

User Experience:

  • Progressive loading creates perception of speed
  • Users can start interacting with results immediately
  • Particularly effective on mobile/slow networks

Simplicity:

  • Standard HTTP features (chunked transfer encoding)
  • No WebSockets or SSE complexity
  • Works with any HTTP client

Implementation: Server-Side (Next.js)

Here's how to implement streaming in a Next.js API route:

// app/api/check-domain/route.ts
import { NextRequest } from 'next/server';

export async function GET(request: NextRequest) {
  const query = request.nextUrl.searchParams.get('q');
  const encoder = new TextEncoder();
 
  // Create a streaming response
  const stream = new ReadableStream({
    async start(controller) {
      const domainsToCheck = EXTENSIONS.map((ext) => ({
        domain: `${query}.${ext}`,
        tld: ext,
      }));
 
      // Start all checks in parallel
      const allChecks = domainsToCheck.map(async ({ domain, tld }) => {
        const available = await checkDomainAvailability(domain);
 
        const result = { domain, available, tld };
 
        // Stream each result immediately
        const line = JSON.stringify(result) + '\n';
        controller.enqueue(encoder.encode(line));
 
        return result;
      });
 
      // Wait for all checks to complete
      await Promise.all(allChecks);
 
      controller.close();
    },
  });
 
  return new Response(stream, {
    headers: {
      'Content-Type': 'application/x-ndjson',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  });
}

Key Points

ReadableStream API: The modern Web Streams API provides native browser support for streaming.

TextEncoder: Convert strings to bytes for streaming over HTTP.

NDJSON Content-Type: The application/x-ndjson MIME type signals line-delimited JSON.

controller.enqueue(): Push data chunks to the stream as they become available.

controller.close(): Signal the end of the stream once all results are sent.

Implementation: Client-Side (React)

Consuming a streaming response requires reading the stream chunk by chunk:

const searchDomains = async (query: string) => {
  const response = await fetch(`/api/check-domain?q=${query}`);
 
  if (!response.body) {
    throw new Error('Response body is null');
  }
 
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
 
  // Clear results before streaming
  setResults([]);
 
  while (true) {
    const { done, value } = await reader.read();
 
    if (done) break;
 
    // Decode chunk and add to buffer
    buffer += decoder.decode(value, { stream: true });
 
    // Split by newlines to get complete JSON objects
    const lines = buffer.split('\n');
 
    // Keep incomplete line in buffer
    buffer = lines.pop() || '';
 
    // Parse and display each complete line
    for (const line of lines) {
      if (line.trim()) {
        const result = JSON.parse(line);
        setResults((prev) => [...prev, result]);
      }
    }
  }

  // Flush the decoder and parse any final line without a trailing newline
  buffer += decoder.decode();
  if (buffer.trim()) {
    setResults((prev) => [...prev, JSON.parse(buffer)]);
  }
};

How It Works

response.body.getReader(): Get a ReadableStream reader for chunk-by-chunk processing.

TextDecoder: Convert byte chunks back to strings.

Buffer management: Handle partial lines that span multiple chunks.

Progressive updates: Add each result to the UI as soon as it's parsed.

Performance Impact

The performance improvements are dramatic:

Before Streaming:

  • Time to first result: 971ms (worst case)
  • User perception: Slow, unresponsive
  • Cache hit rate: Standard JSON caching

After Streaming:

  • Time to first result: 79ms (best case)
  • User perception: Fast, responsive
  • Cache hit rate: Same (per-result caching)
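The article's cache implementation isn't shown; since each domain is checked and streamed independently, a per-result cache could be as simple as a Map keyed by domain (a sketch — `ttlMs` and the injectable `now` parameter are my own assumptions, the latter for testability):

```typescript
// Hypothetical per-domain cache: one entry per domain, not per whole query.
const cache = new Map<string, { available: boolean; expires: number }>();

// Return the cached availability, or undefined on a miss or expired entry.
function getCached(domain: string, now = Date.now()): boolean | undefined {
  const hit = cache.get(domain);
  if (!hit || hit.expires < now) return undefined;
  return hit.available;
}

// Store a result with a time-to-live.
function setCached(domain: string, available: boolean, ttlMs = 60_000, now = Date.now()): void {
  cache.set(domain, { available, expires: now + ttlMs });
}
```

A repeated search for the same keyword can then stream cached results instantly and only re-check expired entries.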

Why This Matters

Mobile Networks: On 3G/4G connections, the difference between 79ms and 971ms is the difference between "instant" and "slow."

User Engagement: Users who see results immediately are more likely to interact with your tool.

Competitive Advantage: Tools like InstantDomainSearch built their reputation on speed — streaming is a key enabler.

Practical Tips and Best Practices

1. Use NDJSON for Streaming

The application/x-ndjson MIME type is the de facto convention for line-delimited JSON:

headers: {
  'Content-Type': 'application/x-ndjson',
}

2. Handle Partial Chunks Correctly

Network packets can split JSON objects mid-line. Always buffer incomplete lines:

const lines = buffer.split('\n');
buffer = lines.pop() || ''; // Keep last (incomplete) line
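The same buffering logic can be factored into a small reusable helper (a sketch; `NDJSONBuffer` is a name I'm introducing here, not part of any library):

```typescript
// Accumulates raw text chunks and emits every complete NDJSON line as a
// parsed object, keeping the trailing partial line for the next call.
class NDJSONBuffer {
  private buffer = '';

  push(chunk: string): unknown[] {
    this.buffer += chunk;
    const lines = this.buffer.split('\n');
    this.buffer = lines.pop() ?? ''; // keep the incomplete tail
    return lines.filter((l) => l.trim()).map((l) => JSON.parse(l));
  }
}
```

A chunk boundary in the middle of a JSON object then just delays that object until the next `push` completes it.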

3. Consider Result Ordering

By default, results arrive in completion order. For alphabetical display, sort on the client:

const sortedResults = [...results].sort((a, b) => a.tld.localeCompare(b.tld));

4. Add Timeout Protection

Slow checks shouldn't block forever:

const checkWithTimeout = async (domain: string) => {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 3000);
 
  try {
    // rdapUrl: the registry's RDAP endpoint for this domain, defined elsewhere
    return await fetch(rdapUrl, { signal: controller.signal });
  } finally {
    clearTimeout(timeout);
  }
};
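An alternative that works for any promise, not just fetch, is a Promise.race-based wrapper (a sketch; `withTimeout` and its `fallback` parameter are my own names): on timeout it resolves to a fallback value, so the stream can still emit an "unknown" result instead of stalling.

```typescript
// Resolve with `fallback` if `promise` doesn't settle within `ms` milliseconds.
// The losing promise is simply abandoned, not cancelled.
function withTimeout<T>(promise: Promise<T>, ms: number, fallback: T): Promise<T> {
  let timer: ReturnType<typeof setTimeout>;
  const timeout = new Promise<T>((resolve) => {
    timer = setTimeout(() => resolve(fallback), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer!));
}
```

Unlike AbortController, this doesn't stop the underlying work, but it guarantees the stream keeps moving.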

Alternatives and Trade-offs

Server-Sent Events (SSE)

SSE provides a higher-level abstraction but adds complexity:

// SSE approach
const eventSource = new EventSource('/api/check-domain?q=example');
 
eventSource.onmessage = (event) => {
  const result = JSON.parse(event.data);
  setResults((prev) => [...prev, result]);
};

Pros: Built-in reconnection, event types

Cons: More overhead, one-way only, subject to browser per-domain connection limits over HTTP/1.1

WebSockets

Full-duplex communication, but overkill for one-way streaming:

Pros: Bidirectional, low latency

Cons: Complex infrastructure, stateful connections

JSON Streaming + NDJSON (My Choice)

The simplest solution for streaming structured data:

Pros: Standard HTTP, minimal overhead, works everywhere

Cons: Manual chunk buffering, no built-in reconnection

For domain search, JSON Streaming + NDJSON is the sweet spot — maximum performance with minimal complexity.

Conclusion

JSON Streaming is a simple yet powerful technique for improving perceived latency in API responses. By displaying results as they arrive, you create a more responsive user experience without changing your backend architecture.

Whether you're building a domain search tool, a real-time dashboard, or any application with parallel data fetching, JSON Streaming can dramatically improve your user experience.

Try it in your next project — your users will thank you for the speed boost.