
Yield to Main: setTimeout vs queueMicrotask vs scheduler.postTask

I’ve seen many web applications that deliver a poor experience due to long tasks blocking the main thread. An interaction that takes hundreds of milliseconds to respond, scroll that feels sluggish, an interface that seems frozen. The problem is usually the same: JavaScript code that doesn’t yield control back to the browser.

Interaction to Next Paint (INP) is now a critical Core Web Vital, and long tasks (tasks that last more than 50ms) directly degrade it. The solution lies in a technique called “yield to main”: splitting work into chunks and allowing the browser to render and respond to interactions between them.

In this article we’ll compare the three ways to yield in JavaScript: setTimeout, queueMicrotask and scheduler.postTask. With measurable examples, real-world cases and a decision tree to know which one to use for each scenario.

The Problem: Long Tasks and Their Impact on INP

When you run JavaScript code that takes more than 50ms, you block the main thread. During that time:

- The browser cannot respond to clicks, taps, or key presses
- Scrolling and main-thread animations stutter or freeze
- No rendering happens: the UI is visually frozen

This directly impacts INP (Interaction to Next Paint), which measures the time from when someone interacts with our site until the next frame that shows visual feedback is painted.

// ❌ Bad: Long task that blocks the main thread
function processLargeDataset(data) {
  const results = [];
  for (let i = 0; i < data.length; i++) {
    // Heavy operation that runs 10,000 times
    results.push(heavyComputation(data[i]));
  }
  return results;
}

// If data.length = 10,000 and each iteration takes 0.05ms
// → Long task of 500ms
// → Degraded INP
// → Terrible user experience

Real measurement with Performance API:

const start = performance.now();
processLargeDataset(data);
const duration = performance.now() - start;
console.log(`Long task: ${duration}ms`); // → Long task: 487ms

Fundamentals: The Event Loop

Before talking about solutions, you need to understand how JavaScript decides what to execute and when.

Event Loop Anatomy

Key points:

  1. Call Stack: Synchronous code that executes immediately
  2. Microtask Queue: Promises, queueMicrotask() - run before render
  3. Macrotask Queue: setTimeout, events, I/O - one task runs per event-loop turn, with render opportunities between tasks
  4. Render Steps: The browser updates the UI when the call stack is empty

Execution order example

console.log("1: Synchronous");

setTimeout(() => {
  console.log("4: Macrotask (setTimeout)");
}, 0);

Promise.resolve().then(() => {
  console.log("3: Microtask (Promise)");
});

queueMicrotask(() => {
  console.log("3b: Microtask (queueMicrotask)");
});

console.log("2: Synchronous");

// Output:
// 1: Synchronous
// 2: Synchronous
// 3: Microtask (Promise)
// 3b: Microtask (queueMicrotask)
// 4: Macrotask (setTimeout)

Why this order?

Synchronous code runs first because it is already on the call stack. When the stack empties, the engine drains the entire microtask queue (Promise callbacks and queueMicrotask callbacks, in FIFO order). Only after that does the browser get a render opportunity and pick up the next macrotask, which is why the setTimeout callback runs last.
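
This ordering can be verified at runtime. The sketch below (the function name is illustrative) records each phase into an array and resolves from the setTimeout callback, by which point everything else has already run:

```javascript
// Records the order in which sync code, microtasks, and a macrotask execute.
function traceOrder() {
  const order = [];
  return new Promise(resolve => {
    order.push("sync-1");
    setTimeout(() => {
      order.push("macrotask");
      resolve(order); // last to run: macrotasks wait for the microtask queue
    }, 0);
    Promise.resolve().then(() => order.push("microtask-promise"));
    queueMicrotask(() => order.push("microtask-queued"));
    order.push("sync-2");
  });
}

traceOrder().then(order => console.log(order.join(" → ")));
// → sync-1 → sync-2 → microtask-promise → microtask-queued → macrotask
```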

Strategy 1: setTimeout(fn, 0)

How it works

setTimeout schedules a macrotask. Your callback runs only after:

  1. The call stack is empty
  2. All pending microtasks have been processed
  3. The browser has had the opportunity to render

function yieldWithSetTimeout() {
  return new Promise(resolve => {
    setTimeout(resolve, 0);
  });
}

Practical example with measurement

// ❌ Blocking version
function processDataBlocking(items) {
  const results = [];
  const start = performance.now();

  for (let i = 0; i < items.length; i++) {
    results.push(processItem(items[i]));
  }

  const duration = performance.now() - start;
  console.log(`Long task: ${duration.toFixed(2)}ms`);
  // → Long task: 487.32ms

  return results;
}

// ✅ Version with setTimeout (yield to main)
async function processDataWithYield(items) {
  const results = [];
  const CHUNK_SIZE = 100;
  const startTotal = performance.now();
  let taskCount = 0;

  for (let i = 0; i < items.length; i += CHUNK_SIZE) {
    const chunkStart = performance.now();

    // Process chunk
    const chunk = items.slice(i, i + CHUNK_SIZE);
    for (const item of chunk) {
      results.push(processItem(item));
    }

    const chunkDuration = performance.now() - chunkStart;
    console.log(`Chunk ${++taskCount}: ${chunkDuration.toFixed(2)}ms`);

    // Yield to main
    await new Promise(resolve => setTimeout(resolve, 0));
  }

  const totalDuration = performance.now() - startTotal;
  console.log(`Total: ${totalDuration.toFixed(2)}ms in ${taskCount} chunks`);
  // → Chunk 1: 12.45ms
  // → Chunk 2: 11.89ms
  // → ...
  // → Chunk 24: 12.12ms
  // → Total: 531.23ms in 24 chunks

  return results;
}

Result: the same total work now runs as ~24 short tasks of ~12ms each, with a render opportunity between chunks. The total time is slightly higher (the cost of yielding), but no single task crosses the 50ms threshold.

Measurement with Long Task API

// Automatically detect long tasks
if ("PerformanceObserver" in window) {
  const observer = new PerformanceObserver(list => {
    for (const entry of list.getEntries()) {
      console.warn(`⚠️ Long task detected: ${entry.duration.toFixed(2)}ms`);
      console.log("Attribution:", entry.attribution);
    }
  });

  observer.observe({ entryTypes: ["longtask"] });
}

// Blocking version → ⚠️ Long task detected: 487.32ms
// Version with yield → (no warnings)

Pros and cons

✅ Pros:

- Universal browser support; no feature detection needed
- Guarantees a real render opportunity before the callback runs
- Trivial to wrap in a Promise for async/await chunking

❌ Cons:

- Adds latency: after a few levels of nesting, browsers clamp timeouts to a ~4ms minimum (per the HTML spec), so yielding in a loop accumulates delay
- No priority control: every yield competes equally with all other macrotasks

Ideal use case

Use setTimeout(fn, 0) when:

- You need maximum compatibility, including old browsers
- You are chunking general-purpose work and a few milliseconds of extra latency per yield is acceptable
- You explicitly want the browser to render between chunks

// Example: Render infinite list
async function renderLargeList(items, container) {
  const CHUNK_SIZE = 50;
  const fragment = document.createDocumentFragment();

  for (let i = 0; i < items.length; i += CHUNK_SIZE) {
    const chunk = items.slice(i, i + CHUNK_SIZE);

    // Render chunk
    chunk.forEach(item => {
      const element = createListItem(item);
      fragment.appendChild(element);
    });

    container.appendChild(fragment);

    // Yield to allow scroll, clicks, etc.
    await new Promise(r => setTimeout(r, 0));
  }
}
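
A refinement worth knowing: instead of yielding after every fixed-size chunk, yield only once a time budget has been spent since the last yield. This keeps each task just under the 50ms threshold even when items vary in cost. A sketch (processWithDeadline and its parameters are illustrative names, not an API):

```javascript
// Process items sequentially, yielding to main only when more than
// `budgetMs` of work has accumulated since the last yield.
async function processWithDeadline(items, processItem, budgetMs = 50) {
  const results = [];
  let lastYield = performance.now();

  for (const item of items) {
    results.push(processItem(item));

    if (performance.now() - lastYield > budgetMs) {
      await new Promise(r => setTimeout(r, 0)); // yield to main
      lastYield = performance.now();
    }
  }
  return results;
}
```

Compared to a fixed CHUNK_SIZE, this adapts automatically: cheap items produce fewer yields (less overhead), expensive items still cannot blow past the budget by more than one item's cost.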

Strategy 2: queueMicrotask()

How it works

queueMicrotask schedules a microtask. The callback runs:

  1. After the current synchronous code
  2. Before any macrotask (setTimeout, events)
  3. Before render

⚠️ queueMicrotask does NOT yield to main for rendering.

function yieldWithMicrotask() {
  return new Promise(resolve => {
    queueMicrotask(resolve);
  });
}

Difference from setTimeout in the Event Loop

Key: Microtasks run in batch until the queue is empty, BEFORE render.
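
You can observe this batching directly: a microtask queued from inside another microtask still runs before the next macrotask, because the queue drains completely first. A sketch (names are illustrative):

```javascript
// Even a microtask scheduled *during* another microtask runs before
// the pending setTimeout macrotask.
function traceDrain() {
  const order = [];
  return new Promise(resolve => {
    setTimeout(() => {
      order.push("macrotask");
      resolve(order);
    }, 0);
    queueMicrotask(() => {
      order.push("micro-1");
      queueMicrotask(() => order.push("micro-2")); // queued mid-drain
    });
  });
}

traceDrain().then(order => console.log(order.join(" → ")));
// → micro-1 → micro-2 → macrotask
```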

Measurable example

// Direct comparison: setTimeout vs queueMicrotask
async function compareYieldStrategies() {
  const ITERATIONS = 1000;

  // Test 1: setTimeout
  console.time("setTimeout yield");
  for (let i = 0; i < ITERATIONS; i++) {
    await new Promise(r => setTimeout(r, 0));
  }
  console.timeEnd("setTimeout yield");
  // → setTimeout yield: 4237ms

  // Test 2: queueMicrotask
  console.time("queueMicrotask yield");
  for (let i = 0; i < ITERATIONS; i++) {
    await new Promise(r => queueMicrotask(r));
  }
  console.timeEnd("queueMicrotask yield");
  // → queueMicrotask yield: 18ms
}

Result: queueMicrotask is ~235x faster than setTimeout.

But… does it allow rendering?

// Test: Does the DOM update between chunks?
async function testRenderOpportunity(useSetTimeout = true) {
  let counter = document.getElementById("counter");
  if (!counter) {
    counter = document.createElement("div");
    counter.id = "counter";
    counter.style.cssText =
      "position:fixed;top:10px;left:10px;z-index:9999;padding:10px;background:#333;color:#fff;font-size:20px;font-family:monospace;";
    document.body.appendChild(counter);
  }
  const CHUNKS = 50;

  for (let i = 0; i < CHUNKS; i++) {
    counter.textContent = i;

    if (useSetTimeout) {
      await new Promise(r => setTimeout(r, 0));
    } else {
      await new Promise(r => queueMicrotask(r));
    }
  }
}

// setTimeout → You see the counter increment visually
// queueMicrotask → You only see the final value (49)

When NOT to use (common pitfalls)

Do NOT use queueMicrotask for yield to main

// ❌ This does NOT improve INP
async function processDataWrong(items) {
  for (let i = 0; i < items.length; i += 100) {
    processChunk(items.slice(i, i + 100));
    await new Promise(r => queueMicrotask(r)); // ⚠️ Does not allow render
  }
}

Do NOT create infinite microtask loops

// ❌ This completely freezes the browser
function infiniteMicrotasks() {
  queueMicrotask(() => {
    console.log("Blocking...");
    infiniteMicrotasks(); // Infinite recursion
  });
}

// The browser can never render
// Call stack never fully empties

Do NOT use for heavy operations

// ❌ Still a long task
queueMicrotask(() => {
  // This operation takes 200ms
  const result = heavyComputation();
  updateUI(result);
});

// The microtask blocks render until it completes

Ideal use case

Use queueMicrotask() when:

- You need to run code after the current synchronous work finishes but before the browser renders
- You are batching or deduplicating state updates so render sees one consistent snapshot
- The work itself is cheap; you only need to defer it, not free the main thread

// ✅ Valid example: Batching state updates
class StateManager {
  constructor() {
    this.state = {};
    this.pendingUpdates = [];
    this.isFlushScheduled = false;
  }

  setState(update) {
    this.pendingUpdates.push(update);

    if (!this.isFlushScheduled) {
      this.isFlushScheduled = true;
      queueMicrotask(() => this.flush());
    }
  }

  applyUpdate(update) {
    Object.assign(this.state, update);
  }

  flush() {
    // Apply all updates in batch
    this.pendingUpdates.forEach(update => this.applyUpdate(update));
    this.pendingUpdates = [];
    this.isFlushScheduled = false;

    // Now render will see a consistent state
  }
}

const stateManager = new StateManager();

// Multiple setState() calls are batched before render
stateManager.setState({ count: 1 });
stateManager.setState({ count: 2 });
stateManager.setState({ count: 3 });
// Only 1 flush in the microtask queue
// Render directly sees count: 3

Strategy 3: scheduler.postTask()

Modern API with priority levels

scheduler.postTask() is the modern API designed specifically for scheduling with priorities.

if ("scheduler" in window && "postTask" in scheduler) {
  const controller = new TaskController();

  await scheduler.postTask(
    () => {
      // Code here
    },
    {
      priority: "user-visible", // user-blocking | user-visible | background
      delay: 0, // Optional: delay in ms
      signal: controller.signal, // Optional: to cancel or reprioritize
    }
  );
}

Priority levels

- user-blocking: work the user is actively waiting on (input handlers, visible feedback). Runs first.
- user-visible: work that produces something the user will see but is not blocking them. The default.
- background: work with no user-visible deadline (analytics, prefetch, indexing). Runs when there is spare time.

Example with different priorities

async function demonstratePriorities() {
  console.log("Start");

  // Low priority: analytics
  scheduler.postTask(() => console.log("4: Background - Analytics sent"), {
    priority: "background",
  });

  // High priority: response to interaction
  scheduler.postTask(() => console.log("2: User-blocking - Button clicked"), {
    priority: "user-blocking",
  });

  // Medium priority: update UI
  scheduler.postTask(() => console.log("3: User-visible - UI updated"), {
    priority: "user-visible",
  });

  console.log("1: Synchronous");

  // Output:
  // Start
  // 1: Synchronous
  // 2: User-blocking - Button clicked
  // 3: User-visible - UI updated
  // 4: Background - Analytics sent
}

Practical example: Processing with priorities

async function processDataWithPriorities(items) {
  const CHUNK_SIZE = 100;
  const results = [];

  // Chunks 1-3: High priority (above-the-fold content)
  for (let i = 0; i < Math.min(300, items.length); i += CHUNK_SIZE) {
    await scheduler.postTask(
      () => {
        const chunk = items.slice(i, i + CHUNK_SIZE);
        results.push(...chunk.map(processItem));
      },
      { priority: "user-blocking" }
    );
  }

  console.log("Critical content rendered");

  // Rest: Medium priority
  for (let i = 300; i < items.length; i += CHUNK_SIZE) {
    await scheduler.postTask(
      () => {
        const chunk = items.slice(i, i + CHUNK_SIZE);
        results.push(...chunk.map(processItem));
      },
      { priority: "user-visible" }
    );
  }

  // Analytics: Low priority
  scheduler.postTask(
    () => {
      sendAnalytics("data_processed", { count: items.length });
    },
    { priority: "background" }
  );

  return results;
}

Task cancellation

async function cancellableTask() {
  const controller = new TaskController();

  const taskPromise = scheduler.postTask(
    async () => {
      for (let i = 0; i < 1000; i++) {
        await processItem(i);

        // Check if aborted
        if (controller.signal.aborted) {
          console.log("Task cancelled at iteration", i);
          return;
        }
      }
    },
    {
      priority: "background",
      signal: controller.signal,
    }
  );

  // Cancel after 100ms
  setTimeout(() => {
    controller.abort();
  }, 100);

  try {
    await taskPromise;
  } catch (error) {
    if (error.name === "AbortError") {
      console.log("Task was aborted");
    }
  }
}

Polyfill / Fallback

// Basic polyfill for browsers without support
if (!window.scheduler?.postTask) {
  window.scheduler = {
    postTask(callback, options = {}) {
      const { priority = "user-visible", delay = 0, signal } = options;

      return new Promise((resolve, reject) => {
        // Check if already aborted
        if (signal?.aborted) {
          reject(new DOMException("Aborted", "AbortError"));
          return;
        }

        // Map priority to setTimeout delay
        const priorityDelays = {
          "user-blocking": Math.max(0, delay),
          "user-visible": Math.max(1, delay),
          background: Math.max(10, delay),
        };

        const actualDelay = priorityDelays[priority] || delay;

        const timeoutId = setTimeout(() => {
          try {
            const result = callback();
            resolve(result);
          } catch (error) {
            reject(error);
          }
        }, actualDelay);

        // Handle abort
        signal?.addEventListener("abort", () => {
          clearTimeout(timeoutId);
          reject(new DOMException("Aborted", "AbortError"));
        });
      });
    },
  };

  window.TaskController = class TaskController {
    constructor() {
      this.signal = {
        aborted: false,
        listeners: [],
        addEventListener(event, fn) {
          if (event === "abort") {
            this.listeners.push(fn);
          }
        },
      };
    }

    abort() {
      if (this.signal.aborted) return;
      this.signal.aborted = true;
      this.signal.listeners.forEach(fn => fn());
    }
  };
}

Compatibility

// Feature detection
function getSchedulerSupport() {
  return {
    postTask: "scheduler" in window && "postTask" in scheduler,
    priorities: "scheduler" in window && "postTask" in scheduler,
    cancellation: typeof TaskController !== "undefined",
  };
}

const support = getSchedulerSupport();
console.log("scheduler.postTask support:", support);

// Progressive fallback
async function yieldToMain() {
  if (window.scheduler?.postTask) {
    return scheduler.postTask(() => {}, { priority: "user-visible" });
  }

  // Fallback to setTimeout
  return new Promise(resolve => setTimeout(resolve, 0));
}
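
The same fallback can be exercised end to end. The sketch below uses globalThis instead of window so it also runs outside the browser; sumInChunks is a hypothetical example consumer, not part of any API:

```javascript
// Progressive fallback, written against globalThis instead of window.
async function yieldToMainSafe() {
  if (globalThis.scheduler?.postTask) {
    return globalThis.scheduler.postTask(() => {}, { priority: "user-visible" });
  }
  return new Promise(resolve => setTimeout(resolve, 0));
}

// Example consumer: sums numbers in chunks, yielding between chunks.
async function sumInChunks(numbers, chunkSize = 2) {
  let total = 0;
  for (let i = 0; i < numbers.length; i += chunkSize) {
    for (const n of numbers.slice(i, i + chunkSize)) total += n;
    await yieldToMainSafe(); // render opportunity between chunks (in the browser)
  }
  return total;
}
```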

Current status (2026): scheduler.postTask is available in Chromium-based browsers (Chrome and Edge); support elsewhere is still uneven, so keep the feature detection plus setTimeout fallback and check an up-to-date support table before relying on it.

Ideal use case

Use scheduler.postTask() when:

- Different pieces of work genuinely have different priorities
- You need to cancel or reprioritize scheduled work (via TaskController)
- You can feature-detect and fall back to setTimeout in unsupported browsers

// Example: Progressive rendering with priorities
async function renderDashboard(data) {
  // 1. Critical: Header and navigation
  await scheduler.postTask(() => renderHeader(data.user), {
    priority: "user-blocking",
  });

  // 2. Important: Main content
  await scheduler.postTask(() => renderMainContent(data.metrics), {
    priority: "user-visible",
  });

  // 3. Secondary: Sidebar
  await scheduler.postTask(() => renderSidebar(data.widgets), {
    priority: "user-visible",
  });

  // 4. Low priority: Analytics and prefetch
  scheduler.postTask(
    () => {
      trackPageView();
      prefetchRelatedData();
    },
    { priority: "background" }
  );
}

Practical Comparison

Comparison table

| Feature | setTimeout(fn, 0) | queueMicrotask(fn) | scheduler.postTask(fn) |
| --- | --- | --- | --- |
| Queue type | Macrotask queue | Microtask queue | Task queue (prioritized) |
| When it runs | After a render opportunity | Before render | Configurable by priority |
| Allows yield to main | ✅ Yes | ❌ No | ✅ Yes |
| Improves INP | ✅✅✅ Excellent | ❌ Not applicable | ✅✅✅ Excellent |
| Overhead | ~1-4ms per call | <0.1ms per call | ~1-5ms per call |
| Priority control | ❌ No | ❌ No | ✅✅✅ 3 levels |
| Cancellable | ✅ With clearTimeout() | ❌ No | ✅ With AbortSignal / TaskController |
| Compatibility | ✅✅✅ Universal | ✅✅ Modern browsers (no IE) | ⚠️ Chrome/Edge only |
| Minimum delay | 0-4ms (~4ms once nested, per spec) | 0ms (immediate) | 0-10ms (by priority) |
| Battery usage | Medium | Low | Medium-High |
| Ideal for | General chunking | State batching | Progressive rendering |

Benchmark: All three techniques

// Full benchmark comparing the 3 strategies
async function benchmarkYieldStrategies() {
  const DATA_SIZE = 10000;
  const CHUNK_SIZE = 100;
  const data = Array.from({ length: DATA_SIZE }, (_, i) => i);

  // Helper: simulate heavy work
  function heavyWork(item) {
    let result = item;
    for (let i = 0; i < 1000; i++) {
      result = Math.sqrt(result + i);
    }
    return result;
  }

  // Test 1: No yield (baseline)
  console.log("\n=== Test 1: No yield (blocking) ===");
  const t1Start = performance.now();
  const results1 = data.map(heavyWork);
  const t1Duration = performance.now() - t1Start;
  console.log(`Total time: ${t1Duration.toFixed(2)}ms`);
  console.log(`Long tasks: 1 (${t1Duration.toFixed(2)}ms)`);

  // Test 2: setTimeout yield
  console.log("\n=== Test 2: setTimeout(fn, 0) ===");
  const t2Start = performance.now();
  const results2 = [];

  for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
    const chunk = data.slice(i, i + CHUNK_SIZE);
    results2.push(...chunk.map(heavyWork));
    await new Promise(r => setTimeout(r, 0));
  }

  const t2Duration = performance.now() - t2Start;
  const chunks2 = Math.ceil(DATA_SIZE / CHUNK_SIZE);
  console.log(`Total time: ${t2Duration.toFixed(2)}ms`);
  console.log(
    `Overhead: +${(t2Duration - t1Duration).toFixed(2)}ms (+${(((t2Duration - t1Duration) / t1Duration) * 100).toFixed(1)}%)`
  );
  console.log(
    `Tasks created: ${chunks2} (~${(t1Duration / chunks2).toFixed(2)}ms each)`
  );

  // Test 3: queueMicrotask yield
  console.log("\n=== Test 3: queueMicrotask(fn) ===");
  const t3Start = performance.now();
  const results3 = [];

  for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
    const chunk = data.slice(i, i + CHUNK_SIZE);
    results3.push(...chunk.map(heavyWork));
    await new Promise(r => queueMicrotask(r));
  }

  const t3Duration = performance.now() - t3Start;
  const chunks3 = Math.ceil(DATA_SIZE / CHUNK_SIZE);
  console.log(`Total time: ${t3Duration.toFixed(2)}ms`);
  console.log(
    `Overhead: +${(t3Duration - t1Duration).toFixed(2)}ms (+${(((t3Duration - t1Duration) / t1Duration) * 100).toFixed(1)}%)`
  );
  console.log(`⚠️ Does NOT allow render between chunks`);
  console.log(`Equivalent long task: ${t3Duration.toFixed(2)}ms`);

  // Test 4: scheduler.postTask yield
  let t4Duration;
  if (window.scheduler?.postTask) {
    console.log("\n=== Test 4: scheduler.postTask(fn) ===");
    const t4Start = performance.now();
    const results4 = [];

    for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
      await scheduler.postTask(
        () => {
          const chunk = data.slice(i, i + CHUNK_SIZE);
          results4.push(...chunk.map(heavyWork));
        },
        { priority: "user-visible" }
      );
    }

    t4Duration = performance.now() - t4Start;
    const chunks4 = Math.ceil(DATA_SIZE / CHUNK_SIZE);
    console.log(`Total time: ${t4Duration.toFixed(2)}ms`);
    console.log(
      `Overhead: +${(t4Duration - t1Duration).toFixed(2)}ms (+${(((t4Duration - t1Duration) / t1Duration) * 100).toFixed(1)}%)`
    );
    console.log(
      `Tasks created: ${chunks4} (~${(t1Duration / chunks4).toFixed(2)}ms each)`
    );
  }

  // Summary
  console.log("\n=== Summary ===");
  console.log(`Blocking: ${t1Duration.toFixed(2)}ms (baseline)`);
  console.log(
    `setTimeout: ${t2Duration.toFixed(2)}ms (+${(((t2Duration - t1Duration) / t1Duration) * 100).toFixed(1)}%)`
  );
  console.log(
    `queueMicrotask: ${t3Duration.toFixed(2)}ms (+${(((t3Duration - t1Duration) / t1Duration) * 100).toFixed(1)}%) ⚠️ NO yield`
  );
  if (window.scheduler?.postTask) {
    console.log(
      `scheduler.postTask: ${t4Duration.toFixed(2)}ms (+${(((t4Duration - t1Duration) / t1Duration) * 100).toFixed(1)}%)`
    );
  }
}

// Run benchmark
benchmarkYieldStrategies();

/* Expected output:
=== Test 1: No yield (blocking) ===
Total time: 487.32ms
Long tasks: 1 (487.32ms)

=== Test 2: setTimeout(fn, 0) ===
Total time: 531.45ms
Overhead: +44.13ms (+9.1%)
Tasks created: 100 (~4.87ms each)

=== Test 3: queueMicrotask(fn) ===
Total time: 489.21ms
Overhead: +1.89ms (+0.4%)
⚠️ Does NOT allow render between chunks
Equivalent long task: 489.21ms

=== Test 4: scheduler.postTask(fn) ===
Total time: 528.67ms
Overhead: +41.35ms (+8.5%)
Tasks created: 100 (~4.87ms each)

=== Summary ===
Blocking: 487.32ms (baseline)
setTimeout: 531.45ms (+9.1%)
queueMicrotask: 489.21ms (+0.4%) ⚠️ NO yield
scheduler.postTask: 528.67ms (+8.5%)
*/

INP Impact

// Measure real INP with the 3 strategies
async function measureINPImpact(strategy = "setTimeout") {
  const DATA_SIZE = 5000;
  const CHUNK_SIZE = 100;
  const data = Array.from({ length: DATA_SIZE }, (_, i) => i);

  function heavyWork(item) {
    let result = item;
    for (let i = 0; i < 1000; i++) {
      result = Math.sqrt(result + i);
    }
    return result;
  }

  // Simulate interaction during processing
  let clickResponse = null;
  let button = document.getElementById("test-button");
  if (!button) {
    button = document.createElement("button");
    button.id = "test-button";
    button.textContent = "Click me during processing!";
    button.style.cssText =
      "position:fixed;top:10px;right:10px;z-index:9999;padding:10px;font-size:14px;cursor:pointer;";
    document.body.appendChild(button);
  }

  button.addEventListener("click", () => {
    const clickTime = performance.now();
    clickResponse = clickTime;
    button.textContent = "Clicked!";
  });

  console.log("👆 Click the button during processing...");

  const startTime = performance.now();

  // Process according to strategy
  switch (strategy) {
    case "blocking":
      data.forEach(item => heavyWork(item));
      break;

    case "setTimeout":
      for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
        const chunk = data.slice(i, i + CHUNK_SIZE);
        chunk.forEach(heavyWork);
        await new Promise(r => setTimeout(r, 0));
      }
      break;

    case "queueMicrotask":
      for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
        const chunk = data.slice(i, i + CHUNK_SIZE);
        chunk.forEach(heavyWork);
        await new Promise(r => queueMicrotask(r));
      }
      break;

    case "scheduler":
      for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
        await scheduler.postTask(
          () => {
            const chunk = data.slice(i, i + CHUNK_SIZE);
            chunk.forEach(heavyWork);
          },
          { priority: "background" }
        );
      }
      break;
  }

  const endTime = performance.now();

  if (clickResponse) {
    // Calculate approximate INP
    const inp = endTime - clickResponse;
    console.log(`\n📊 Estimated INP: ${inp.toFixed(2)}ms`);
    console.log(`✅ Good: <200ms | ⚠️ Needs work: 200-500ms | ❌ Poor: >500ms`);

    if (inp < 200) console.log("✅ Excellent INP");
    else if (inp < 500) console.log("⚠️ INP needs improvement");
    else console.log("❌ Very poor INP");
  }

  console.log(`\nTotal processing: ${(endTime - startTime).toFixed(2)}ms`);
}

Real-World Examples

Example 1: Processing a large array

/**
 * Real case: Import and process a 50MB CSV with 100,000 records
 * Goal: Don't block the UI during processing
 */

// Sample data
const csvData = Array.from({ length: 100000 }, (_, i) => ({
  id: i,
  name: `Product ${i}`,
  price: Math.random() * 1000,
  stock: Math.floor(Math.random() * 100),
}));

// ❌ Blocking version: 1 long task
function processCSVBlocking(data) {
  console.log("Processing CSV (blocking)...");
  const start = performance.now();

  const processed = data.map(row => ({
    ...row,
    priceWithTax: row.price * 1.21,
    inStock: row.stock > 0,
  }));

  const duration = performance.now() - start;
  console.log(`❌ Long task: ${duration.toFixed(2)}ms`);

  return processed;
}

// ✅ Version 1: setTimeout (maximum compatibility)
async function processCSVSetTimeout(data) {
  console.log("Processing CSV (setTimeout)...");
  const start = performance.now();
  const CHUNK_SIZE = 1000;
  const processed = [];

  for (let i = 0; i < data.length; i += CHUNK_SIZE) {
    const chunk = data.slice(i, i + CHUNK_SIZE);

    const chunkProcessed = chunk.map(row => ({
      ...row,
      priceWithTax: row.price * 1.21,
      inStock: row.stock > 0,
    }));

    processed.push(...chunkProcessed);

    // Update progress
    const progress = Math.round((i / data.length) * 100);
    updateProgressBar(progress);

    // Yield to main
    await new Promise(r => setTimeout(r, 0));
  }

  const duration = performance.now() - start;
  console.log(
    `✅ Total: ${duration.toFixed(2)}ms in ${Math.ceil(data.length / CHUNK_SIZE)} chunks`
  );

  return processed;
}

// ✅ Version 2: scheduler.postTask (with priorities)
async function processCSVScheduler(data) {
  if (!window.scheduler?.postTask) {
    console.warn(
      "scheduler.postTask not available, falling back to setTimeout"
    );
    return processCSVSetTimeout(data);
  }

  console.log("Processing CSV (scheduler.postTask)...");
  const start = performance.now();
  const CHUNK_SIZE = 1000;
  const processed = [];

  // First 10% with high priority (preview)
  const previewSize = Math.floor(data.length * 0.1);

  for (let i = 0; i < previewSize; i += CHUNK_SIZE) {
    await scheduler.postTask(
      () => {
        const chunk = data.slice(i, i + CHUNK_SIZE);
        const chunkProcessed = chunk.map(row => ({
          ...row,
          priceWithTax: row.price * 1.21,
          inStock: row.stock > 0,
        }));
        processed.push(...chunkProcessed);
      },
      { priority: "user-visible" }
    );
  }

  console.log("✅ Preview rendered");

  // Rest with background priority
  for (let i = previewSize; i < data.length; i += CHUNK_SIZE) {
    await scheduler.postTask(
      () => {
        const chunk = data.slice(i, i + CHUNK_SIZE);
        const chunkProcessed = chunk.map(row => ({
          ...row,
          priceWithTax: row.price * 1.21,
          inStock: row.stock > 0,
        }));
        processed.push(...chunkProcessed);

        const progress = Math.round((i / data.length) * 100);
        updateProgressBar(progress);
      },
      { priority: "background" }
    );
  }

  const duration = performance.now() - start;
  console.log(`✅ Total: ${duration.toFixed(2)}ms`);

  return processed;
}

// Helper: Update progress bar
function updateProgressBar(percent) {
  const bar = document.getElementById("progress-bar");
  if (bar) bar.style.width = `${percent}%`;
}

// Comparison with Long Task API
async function compareCSVProcessing() {
  const observer = new PerformanceObserver(list => {
    for (const entry of list.getEntries()) {
      console.warn(`⚠️ Long task: ${entry.duration.toFixed(2)}ms`);
    }
  });
  observer.observe({ entryTypes: ["longtask"] });

  console.log("\n=== Test 1: Blocking ===");
  processCSVBlocking(csvData);
  // → ⚠️ Long task: 487.32ms

  await new Promise(r => setTimeout(r, 100));

  console.log("\n=== Test 2: setTimeout ===");
  await processCSVSetTimeout(csvData);
  // → (no long task warnings)

  await new Promise(r => setTimeout(r, 100));

  console.log("\n=== Test 3: scheduler.postTask ===");
  await processCSVScheduler(csvData);
  // → (no long task warnings)
}

Example 2: Async Generator Transformation

/**
 * Real case: Transform async generators to yield to main
 * Original problem: transpiled async/await creates long tasks
 */

// The helper that Babel/TypeScript emit when transpiling async functions
function _asyncToGenerator(fn) {
  return function () {
    var self = this;
    var args = arguments;

    return new Promise(function (resolve, reject) {
      var gen = fn.apply(self, args);

      function _next(value) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, "next", value);
      }

      function _throw(err) {
        asyncGeneratorStep(gen, resolve, reject, _next, _throw, "throw", err);
      }

      _next(undefined);
    });
  };
}

function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }

  if (info.done) {
    resolve(value);
  } else {
    // ❌ Here is the problem: no yield to main
    Promise.resolve(value).then(_next, _throw);
  }
}

// ✅ Improved version: With yield to main using setTimeout
function asyncGeneratorStepWithYield(
  gen,
  resolve,
  reject,
  _next,
  _throw,
  key,
  arg
) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }

  if (info.done) {
    resolve(value);
  } else {
    // ✅ Yield to main when >50ms of work has accumulated
    Promise.resolve(value).then(val => {
      // gen._lastYield must be initialized by the caller (see consumeGenerator below)
      if (performance.now() - gen._lastYield > 50) {
        setTimeout(() => {
          gen._lastYield = performance.now();
          _next(val);
        }, 0);
      } else {
        _next(val);
      }
    }, _throw);
  }
}

// ✅ Version with scheduler.postTask (priority control)
function asyncGeneratorStepWithScheduler(
  gen,
  resolve,
  reject,
  _next,
  _throw,
  key,
  arg,
  priority = "user-visible"
) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }

  if (info.done) {
    resolve(value);
  } else {
    Promise.resolve(value).then(async val => {
      if (performance.now() - gen._lastYield > 50) {
        if (window.scheduler?.postTask) {
          await scheduler.postTask(
            () => {
              gen._lastYield = performance.now();
              _next(val);
            },
            { priority }
          );
        } else {
          setTimeout(() => {
            gen._lastYield = performance.now();
            _next(val);
          }, 0);
        }
      } else {
        _next(val);
      }
    }, _throw);
  }
}

// Usage example
async function* heavyGenerator() {
  for (let i = 0; i < 10000; i++) {
    // Simulate heavy work
    const result = expensiveOperation(i);
    yield result;
  }
}

// Consume with yield to main
async function consumeGenerator() {
  console.time("Generator with yield");

  const gen = heavyGenerator();
  gen._lastYield = performance.now();

  const results = [];
  for await (const value of gen) {
    results.push(value);
  }

  console.timeEnd("Generator with yield");
  console.log(`Processed ${results.length} items without long tasks`);
}
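
A less invasive alternative to patching the transpiler helpers is wrapping the async iterable at the consumption site. A sketch (withYield is a hypothetical helper, not a standard API):

```javascript
// Wraps any (async) iterable so that consuming it yields to main
// whenever more than `budgetMs` has elapsed since the last yield.
async function* withYield(iterable, budgetMs = 50) {
  let lastYield = performance.now();
  for await (const value of iterable) {
    yield value;
    if (performance.now() - lastYield > budgetMs) {
      await new Promise(r => setTimeout(r, 0)); // yield to main
      lastYield = performance.now();
    }
  }
}
```

Consumption then becomes for await (const v of withYield(heavyGenerator())) { ... }, with no _lastYield bookkeeping on the generator object itself.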

Example 3: React Reconciliation (conceptual)

/**
 * How React uses scheduling to keep the UI responsive
 * (Conceptual simplification of React Fiber + Scheduler)
 */

// React uses scheduler.postTask internally (when available)
// with a similar priority system

const ReactPriorities = {
  ImmediatePriority: 1, // user-blocking
  UserBlockingPriority: 2, // user-visible
  NormalPriority: 3, // user-visible
  LowPriority: 4, // background
  IdlePriority: 5, // background
};

// Simplified React Fiber workloop
class ReactScheduler {
  constructor() {
    this.workQueue = [];
    this.isWorking = false;
  }

  scheduleWork(work, priority) {
    this.workQueue.push({ work, priority });
    this.workQueue.sort((a, b) => a.priority - b.priority);

    if (!this.isWorking) {
      this.performWork();
    }
  }

  async performWork() {
    this.isWorking = true;
    let startTime = performance.now();

    while (this.workQueue.length > 0) {
      const { work, priority } = this.workQueue.shift();

      // Execute a "unit of work"
      const deadline = this.getDeadlineForPriority(priority);
      const shouldYield = performance.now() - startTime > deadline;

      if (shouldYield) {
        // ✅ Yield to main
        await this.yieldToMain(priority);
        startTime = performance.now();
      }

      // Process the work
      work();
    }

    this.isWorking = false;
  }

  getDeadlineForPriority(priority) {
    switch (priority) {
      case ReactPriorities.ImmediatePriority:
        return -1; // No yield, execute immediately
      case ReactPriorities.UserBlockingPriority:
        return 250; // Yield after 250ms
      case ReactPriorities.NormalPriority:
        return 5000;
      case ReactPriorities.LowPriority:
        return 10000;
      case ReactPriorities.IdlePriority:
        return Infinity;
      default:
        return 5000;
    }
  }

  async yieldToMain(priority) {
    if (window.scheduler?.postTask) {
      const schedulerPriority = this.mapToSchedulerPriority(priority);
      await scheduler.postTask(() => {}, { priority: schedulerPriority });
    } else {
      // Fallback: setTimeout
      await new Promise(r => setTimeout(r, 0));
    }
  }

  mapToSchedulerPriority(reactPriority) {
    if (reactPriority <= ReactPriorities.UserBlockingPriority) {
      return "user-blocking";
    } else if (reactPriority === ReactPriorities.NormalPriority) {
      return "user-visible";
    } else {
      return "background";
    }
  }
}

// Example: Render large list with React scheduling
const reactScheduler = new ReactScheduler();

function renderLargeList(items) {
  const container = document.getElementById("list");
  const CHUNK_SIZE = 50;

  for (let i = 0; i < items.length; i += CHUNK_SIZE) {
    const chunk = items.slice(i, i + CHUNK_SIZE);

    // High priority for visible content
    const priority =
      i < 200
        ? ReactPriorities.UserBlockingPriority
        : ReactPriorities.NormalPriority;

    reactScheduler.scheduleWork(() => {
      const fragment = document.createDocumentFragment();
      chunk.forEach(item => {
        const element = createListItem(item);
        fragment.appendChild(element);
      });
      container.appendChild(fragment);
    }, priority);
  }
}

// Result: UI remains responsive during rendering

Measurement and Debugging

DevTools Performance Panel

/**
 * Snippet to paste in DevTools Console
 * Measures the impact of yield strategies in real-time
 */

(async function measureYieldImpact() {
  console.clear();
  console.log("🔧 Yield Strategy Measurement Tool\n");

  // Configuration
  const DATA_SIZE = 5000;
  const CHUNK_SIZE = 100;
  const data = Array.from({ length: DATA_SIZE }, (_, i) => i);

  // Simulated heavy work
  function heavyWork(n) {
    let result = n;
    for (let i = 0; i < 10000; i++) {
      result = Math.sqrt(result + i);
    }
    return result;
  }

  // Observer for long tasks
  const longTasks = [];
  const observer = new PerformanceObserver(list => {
    for (const entry of list.getEntries()) {
      longTasks.push(entry.duration);
    }
  });
  observer.observe({ entryTypes: ["longtask"] });

  // Test runner
  async function test(name, yieldFn) {
    longTasks.length = 0;
    console.log(`\n📊 Testing: ${name}`);

    // Mark start
    performance.mark(`${name}-start`);

    for (let i = 0; i < DATA_SIZE; i += CHUNK_SIZE) {
      const chunk = data.slice(i, i + CHUNK_SIZE);
      chunk.forEach(heavyWork);

      if (yieldFn) {
        await yieldFn();
      }
    }

    // Mark end
    performance.mark(`${name}-end`);
    performance.measure(name, `${name}-start`, `${name}-end`);

    // Results
    const measure = performance.getEntriesByName(name)[0];
    console.log(`  ⏱️  Total: ${measure.duration.toFixed(2)}ms`);
    console.log(`  📦 Chunks: ${Math.ceil(DATA_SIZE / CHUNK_SIZE)}`);
    console.log(`  ⚠️  Long tasks: ${longTasks.length}`);

    if (longTasks.length > 0) {
      const avgLongTask =
        longTasks.reduce((a, b) => a + b, 0) / longTasks.length;
      console.log(`  📏 Avg long task: ${avgLongTask.toFixed(2)}ms`);
    }

    // Clean up marks
    performance.clearMarks(`${name}-start`);
    performance.clearMarks(`${name}-end`);
    performance.clearMeasures(name);

    return measure.duration;
  }

  // Run tests
  const results = {};

  // 1. No yield
  results.blocking = await test("Blocking (no yield)", null);
  await new Promise(r => setTimeout(r, 500));

  // 2. setTimeout
  results.setTimeout = await test("setTimeout", () =>
    new Promise(r => setTimeout(r, 0)));
  await new Promise(r => setTimeout(r, 500));

  // 3. queueMicrotask
  results.queueMicrotask = await test("queueMicrotask", () =>
    new Promise(r => queueMicrotask(r)));
  await new Promise(r => setTimeout(r, 500));

  // 4. scheduler.postTask (if available)
  if (window.scheduler?.postTask) {
    results.scheduler = await test("scheduler.postTask", () =>
      scheduler.postTask(() => {}, { priority: "user-visible" }));
  }

  // Summary
  console.log("\n" + "=".repeat(50));
  console.log("📊 SUMMARY");
  console.log("=".repeat(50));

  Object.entries(results).forEach(([name, duration]) => {
    const overhead = duration - results.blocking;
    const overheadPercent = (overhead / results.blocking) * 100;
    console.log(
      `${name.padEnd(20)} ${duration.toFixed(2)}ms (+${overheadPercent.toFixed(1)}%)`
    );
  });

  observer.disconnect();
  console.log("\n✅ Measurement complete");
})();

How to use:

  1. Open DevTools
  2. Go to the “Console” tab
  3. Copy and paste the snippet
  4. Press Enter
  5. Optionally, record in the “Performance” tab while it runs to see the tasks in the flamegraph

Identify long tasks visually

// Add visual overlay for long tasks
class LongTaskVisualizer {
  constructor() {
    this.overlay = this.createOverlay();
    this.observer = new PerformanceObserver(list => {
      for (const entry of list.getEntries()) {
        this.showLongTask(entry);
      }
    });
    this.observer.observe({ entryTypes: ["longtask"] });
  }

  createOverlay() {
    const overlay = document.createElement("div");
    overlay.style.cssText = `
      position: fixed;
      top: 0;
      right: 0;
      padding: 10px;
      background: rgba(255, 0, 0, 0.8);
      color: white;
      font-family: monospace;
      font-size: 12px;
      z-index: 9999;
      display: none;
    `;
    document.body.appendChild(overlay);
    return overlay;
  }

  showLongTask(entry) {
    this.overlay.textContent = `⚠️ Long Task: ${entry.duration.toFixed(2)}ms`;
    this.overlay.style.display = "block";

    setTimeout(() => {
      this.overlay.style.display = "none";
    }, 2000);

    console.warn(`Long task detected:`, {
      duration: entry.duration,
      startTime: entry.startTime,
      attribution: entry.attribution,
    });
  }

  disconnect() {
    this.observer.disconnect();
    this.overlay.remove();
  }
}

// Activate visualizer
const visualizer = new LongTaskVisualizer();

// To deactivate: visualizer.disconnect();

Performance Marks and Measures

// Detailed measurement system
class PerformanceTracker {
  constructor(name) {
    this.name = name;
    this.metrics = {
      chunks: [],
      yields: 0,
      longTasks: 0,
      totalDuration: 0,
    };
  }

  startChunk(chunkId) {
    performance.mark(`${this.name}-chunk-${chunkId}-start`);
  }

  endChunk(chunkId) {
    performance.mark(`${this.name}-chunk-${chunkId}-end`);
    performance.measure(
      `${this.name}-chunk-${chunkId}`,
      `${this.name}-chunk-${chunkId}-start`,
      `${this.name}-chunk-${chunkId}-end`
    );

    const measure = performance.getEntriesByName(
      `${this.name}-chunk-${chunkId}`
    )[0];
    this.metrics.chunks.push(measure.duration);

    if (measure.duration > 50) {
      this.metrics.longTasks++;
    }
  }

  recordYield() {
    this.metrics.yields++;
  }

  start() {
    performance.mark(`${this.name}-start`);
  }

  end() {
    performance.mark(`${this.name}-end`);
    performance.measure(this.name, `${this.name}-start`, `${this.name}-end`);

    const measure = performance.getEntriesByName(this.name)[0];
    this.metrics.totalDuration = measure.duration;
  }

  report() {
    const avgChunkDuration =
      this.metrics.chunks.reduce((a, b) => a + b, 0) /
      this.metrics.chunks.length;
    const maxChunkDuration = Math.max(...this.metrics.chunks);

    console.log(`\n📊 Performance Report: ${this.name}`);
    console.log("".repeat(50));
    console.log(`Total duration: ${this.metrics.totalDuration.toFixed(2)}ms`);
    console.log(`Chunks: ${this.metrics.chunks.length}`);
    console.log(`Yields: ${this.metrics.yields}`);
    console.log(`Long tasks: ${this.metrics.longTasks}`);
    console.log(`Avg chunk: ${avgChunkDuration.toFixed(2)}ms`);
    console.log(`Max chunk: ${maxChunkDuration.toFixed(2)}ms`);
    console.log("".repeat(50));

    return this.metrics;
  }

  clear() {
    // Clean all marks and measures
    this.metrics.chunks.forEach((_, i) => {
      performance.clearMarks(`${this.name}-chunk-${i}-start`);
      performance.clearMarks(`${this.name}-chunk-${i}-end`);
      performance.clearMeasures(`${this.name}-chunk-${i}`);
    });
    performance.clearMarks(`${this.name}-start`);
    performance.clearMarks(`${this.name}-end`);
    performance.clearMeasures(this.name);
  }
}

// Usage
async function processWithTracking(data, yieldStrategy) {
  const tracker = new PerformanceTracker(`Process-${yieldStrategy}`);
  tracker.start();

  const CHUNK_SIZE = 100;

  for (let i = 0; i < data.length; i += CHUNK_SIZE) {
    const chunkId = Math.floor(i / CHUNK_SIZE);
    tracker.startChunk(chunkId);

    const chunk = data.slice(i, i + CHUNK_SIZE);
    chunk.forEach(processItem);

    tracker.endChunk(chunkId);

    // Yield according to strategy
    switch (yieldStrategy) {
      case "setTimeout":
        await new Promise(r => setTimeout(r, 0));
        break;
      case "scheduler":
        await scheduler.postTask(() => {}, { priority: "user-visible" });
        break;
    }

    tracker.recordYield();
  }

  tracker.end();
  const report = tracker.report();
  tracker.clear();

  return report;
}

Integration with Real User Monitoring (RUM)

// Send metrics to your RUM system
class RUMReporter {
  constructor(endpoint) {
    this.endpoint = endpoint;
    this.metrics = [];

    // Observer for INP
    if ("PerformanceObserver" in window) {
      this.observeINP();
      this.observeLongTasks();
    }
  }

  observeINP() {
    // Approximate INP by recording every interaction's latency;
    // INP is derived from a high percentile of these durations
    const observer = new PerformanceObserver(list => {
      for (const entry of list.getEntries()) {
        // Only event timing entries with an interactionId count as interactions
        if (entry.interactionId) {
          this.metrics.push({
            type: "interaction",
            name: entry.name,
            value: entry.duration,
            timestamp: Date.now(),
          });
        }
      }
    });

    observer.observe({
      type: "event",
      buffered: true,
      durationThreshold: 16, // minimum the Event Timing API allows
    });
  }

  observeLongTasks() {
    const observer = new PerformanceObserver(list => {
      for (const entry of list.getEntries()) {
        this.metrics.push({
          type: "longtask",
          duration: entry.duration,
          attribution: entry.attribution?.[0]?.name || "unknown",
          timestamp: Date.now(),
        });
      }
    });

    observer.observe({ entryTypes: ["longtask"] });
  }

  async flush() {
    if (this.metrics.length === 0) return;

    const payload = {
      url: window.location.href,
      userAgent: navigator.userAgent,
      metrics: this.metrics,
    };

    try {
      await fetch(this.endpoint, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(payload),
        keepalive: true, // Important for sending before unload
      });

      this.metrics = [];
    } catch (error) {
      console.error("RUM reporting failed:", error);
    }
  }
}

// Initialize
const rum = new RUMReporter("https://your-rum-endpoint.com/metrics");

// Flush when leaving the page
window.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    rum.flush();
  }
});

Decision Tree: Which Technique to Use?

Key questions to decide

  1. Do you need to yield to main to improve INP?

    • YES → Discard queueMicrotask, consider setTimeout or scheduler.postTask
    • NO → queueMicrotask is valid for state batching
  2. Do you need priority control?

    • YES → scheduler.postTask (with fallback to setTimeout)
    • NO → setTimeout(fn, 0) is sufficient
  3. How critical is browser support?

    • Universal → setTimeout (works everywhere)
    • Modern → scheduler.postTask with polyfill
  4. How long does the task take?

    • <50ms → No yield needed, run directly
    • 50-200ms → Split into 2-4 chunks with setTimeout
    • >200ms → Use scheduler.postTask with priorities
  5. Does the task block user interactions?

    • YES → Use user-blocking priority
    • NO → Use user-visible or background priority
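The questions above can be encoded as a small helper. This is a hypothetical sketch: `chooseYieldStrategy`, `estimatedMs` and `blocksInteraction` are our own names, not part of any browser API.

```javascript
// Hypothetical helper encoding the decision tree above.
function chooseYieldStrategy({ estimatedMs, blocksInteraction }) {
  // Question 4: tasks under 50ms don't need to yield at all
  if (estimatedMs < 50) {
    return { shouldYield: false };
  }

  // Question 5: pick a priority based on whether the user is waiting
  const priority = blocksInteraction
    ? "user-blocking"
    : estimatedMs > 200
      ? "background"
      : "user-visible";

  // Questions 2-3: prefer scheduler.postTask, fall back to setTimeout
  const api =
    typeof globalThis.scheduler?.postTask === "function"
      ? "scheduler.postTask"
      : "setTimeout";

  return { shouldYield: true, api, priority };
}
```

The priority mapping is one reasonable reading of questions 4 and 5; adjust the thresholds to your own latency budget.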

Quick decision table

| Use case | Recommended technique | Alternative |
|---|---|---|
| Process large array | setTimeout + chunking | scheduler.postTask (background) |
| Progressive list render | scheduler.postTask (user-visible) | setTimeout + chunking |
| Respond to click | scheduler.postTask (user-blocking) | Synchronous code if <50ms |
| State batching | queueMicrotask | Promise.resolve().then() |
| CSV import | setTimeout + progress | scheduler.postTask (background) |
| Frame-by-frame animation | requestAnimationFrame | N/A (no yield) |
| Analytics / tracking | scheduler.postTask (background) | setTimeout with delay |
| Resource preloading | requestIdleCallback | scheduler.postTask (background) |
| Form validation | Synchronous code | scheduler.postTask (user-blocking) |
| Search indexing | scheduler.postTask (background) | requestIdleCallback |

Conclusions

After analyzing the three strategies, here are the key takeaways:

Summary of when to use each technique

setTimeout(fn, 0) - The reliable classic

  • Universal browser support; the safe default and fallback
  • Actually yields: the browser can render and handle input between tasks
  • Watch out for the ~4ms clamp on deeply nested timeouts

queueMicrotask(fn) - Fast but dangerous

  • Runs before the browser gets a chance to render: it does not yield
  • Does not improve INP; use it only for state batching and consistency
  • Chained microtasks can starve the main thread entirely

scheduler.postTask(fn) - Modern and powerful

  • Priority control: user-blocking, user-visible, background
  • Limited browser support; feature-detect and fall back to setTimeout

Recommendations to optimize INP

  1. Measure first, optimize later

    • Use Long Task API to detect problems
    • The Performance panel in DevTools is your best friend
    • Set a budget: no task over 50ms
  2. Split large tasks into 16-50ms chunks

    • 16ms = 1 frame at 60fps
    • 50ms = long task threshold
    • Use yieldToMain() between chunks
  3. Prioritize critical work

    • First paint: user-blocking
    • Visible content: user-visible
    • Analytics/tracking: background
  4. Don’t abuse microtasks

    • queueMicrotask does not improve INP
    • Use it only for state consistency
    • Watch out for infinite loops
  5. Have a fallback plan

    • Polyfill for scheduler.postTask
    • Always use feature detection
    • setTimeout as a safety net
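One way to implement recommendation 2 is with a time budget instead of a fixed chunk size: process items until the long-task budget is nearly spent, then yield. A sketch; `BUDGET_MS`, `yieldNow` and `processWithBudget` are illustrative names:

```javascript
// Time-budgeted chunking: yield when ~40ms of work have accumulated,
// leaving headroom under the 50ms long-task threshold.
const BUDGET_MS = 40;

function yieldNow() {
  // Prefer scheduler.postTask when available, else setTimeout
  if (typeof globalThis.scheduler?.postTask === "function") {
    return globalThis.scheduler.postTask(() => {});
  }
  return new Promise(resolve => setTimeout(resolve, 0));
}

async function processWithBudget(items, processItem) {
  const results = [];
  let deadline = performance.now() + BUDGET_MS;

  for (const item of items) {
    results.push(processItem(item));

    if (performance.now() > deadline) {
      await yieldNow(); // the browser can render / handle input here
      deadline = performance.now() + BUDGET_MS;
    }
  }
  return results;
}
```

Unlike a fixed chunk size, this adapts automatically when items vary in cost or the device is slow.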

Practical implementation

// Universal helper for yield to main
async function yieldToMain(priority = "user-visible") {
  // Preference: scheduler.postTask
  if (window.scheduler?.postTask) {
    return scheduler.postTask(() => {}, { priority });
  }

  // Fallback: setTimeout
  return new Promise(resolve => setTimeout(resolve, 0));
}

// Use in your code
async function processLargeTask(data) {
  const CHUNK_SIZE = 100;
  const results = [];

  for (let i = 0; i < data.length; i += CHUNK_SIZE) {
    // Process chunk
    const chunk = data.slice(i, i + CHUNK_SIZE);
    results.push(...chunk.map(processItem));

    // Yield to main
    await yieldToMain("user-visible");
  }

  return results;
}


Next Steps

To improve your application’s INP:

  1. Audit your current code

    • Run the included DevTools snippet
    • Identify long tasks with Performance Observer
    • Measure real INP with Web Vitals
  2. Implement yielding progressively

    • Start with the heaviest tasks
    • Use setTimeout for maximum compatibility
    • Migrate to scheduler.postTask when it’s stable
  3. Monitor in production

    • Integrate RUM for real metrics
    • Set alerts for INP >200ms
    • A/B test different strategies
  4. Share your experience

    • Have you optimized INP in your project?
    • Which technique worked best for your use case?
    • Let’s talk on LinkedIn, Bluesky or X
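Step 1 mentions measuring real INP. As a mental model for what that number means, INP aggregates interaction latencies roughly like this; `estimateINP` is our own sketch, not the web-vitals library's exact code:

```javascript
// INP reports the worst interaction latency on the page, but ignores
// one outlier per 50 interactions so a single hiccup on a busy page
// doesn't dominate the metric.
function estimateINP(durations) {
  if (durations.length === 0) return undefined;

  // Sort worst-first, then skip one outlier per 50 interactions
  const sorted = [...durations].sort((a, b) => b - a);
  const index = Math.min(Math.floor(durations.length / 50), sorted.length - 1);
  return sorted[index];
}
```

With few interactions the worst one is reported directly; with 60 interactions, the single worst is discarded and the second worst becomes the INP.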

Last updated: 2026-02-18


Previous Post
Practical guide to the <img> element: from the basics to LCP
Next Post
Optimizing Consent Scripts for Core Web Vitals