
My Ultimate 2025 Fix: I Built Doddle for Async Iteration

Tired of complex async loops? Discover Doddle, a new JavaScript utility I built for simple, powerful async iteration with controlled concurrency and error handling.


Alex Miller

Senior JavaScript engineer specializing in asynchronous patterns and performance optimization.


The Asynchronous Headache We All Face

If you're a JavaScript or Node.js developer, you live and breathe asynchronous operations. Fetching data, writing to files, querying databases—it's the bedrock of modern applications. But let's be honest: iterating over async tasks can be a real pain. For years, I found myself writing the same boilerplate code, wrestling with the same frustrating trade-offs. It felt like I was constantly choosing between simplicity, performance, and resilience. I couldn't have all three. This frustration became my primary motivation for what I consider my ultimate fix for 2025: a library I built called Doddle.

Before we dive into the solution, let's revisit the core problems that plague our standard async toolset.

The Slowness of Sequential Loops: `for await...of`

The `for await...of` loop was a fantastic addition to JavaScript. It brought synchronous-style clarity to asynchronous iteration. It's readable, simple, and perfect for tasks that must happen one after another.

But what if they don't? Imagine you need to process 100 images from a list, and each task is independent. Awaiting them one at a time means you process image #1, wait, process image #2, wait, and so on. You're leaving a massive amount of performance on the table by not running tasks in parallel: your machine and the services you call sit mostly idle while your script plods along in a single-file line.
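To make that concrete, here is a minimal sketch of the one-at-a-time pattern (`processImage` is a hypothetical async function standing in for the real work):

// Sequential processing: each task waits for the previous one to finish.
// processImage is a hypothetical async function.
async function processSequentially(imagePaths) {
  const results = [];
  for (const path of imagePaths) {
    // Nothing else runs while this awaits, so total time is the sum of every task's time.
    results.push(await processImage(path));
  }
  return results;
}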

The Brittleness of `Promise.all`

Okay, so we need parallelism. The natural next step is `Promise.all`. You can map your list of 100 images to an array of 100 promises and throw them at `Promise.all`. Now we're cooking with gas! All 100 tasks kick off concurrently.
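In code, that unbounded fan-out looks something like this (again using a hypothetical `processImage`):

// All tasks start immediately; nothing limits how many run at once,
// and a single rejection discards every other result.
const results = await Promise.all(imagePaths.map(path => processImage(path)));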

But this approach has two major flaws:

  1. All or Nothing: If just one of those 100 promises rejects, the entire `Promise.all` call rejects immediately. You lose the results of the 99 successful tasks. This makes it incredibly fragile for bulk operations where partial failure is acceptable.
  2. The Thundering Herd: There's no concurrency control. Kicking off 100 (or 10,000) requests at once can overwhelm your own server, crash your process due to memory limits, or get your IP address banned by the API you're calling. It's a sledgehammer when you often need a scalpel.

We have `Promise.allSettled`, which helps with the error handling, but it still lacks any form of concurrency or rate-limiting control.
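For completeness, the `Promise.allSettled` pattern looks roughly like this; note that every task still fires at once:

// allSettled never rejects: each entry reports its own outcome.
const settled = await Promise.allSettled(imagePaths.map(path => processImage(path)));
const results = settled.filter(s => s.status === 'fulfilled').map(s => s.value);
const errors = settled.filter(s => s.status === 'rejected').map(s => s.reason);
// Errors no longer sink the batch, but there is still no concurrency cap or rate limit.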

My 2025 Fix: Introducing Doddle

After writing countless custom batching functions and wrestling with overly complex libraries, I decided to build the tool I always wanted. I call it Doddle, because handling complex async iteration should be, well, a doddle.

Doddle is a lightweight, zero-dependency utility that provides a fluent, chainable API for iterating over asynchronous tasks with fine-grained control over concurrency, error handling, and rate limiting. It combines the readability of a `for` loop with the power of controlled parallel execution.

Here's a taste of what it looks like:

import doddle from 'doddle';

// `api` below is assumed to be a pre-configured HTTP client (e.g. an axios-style instance).
const urls = ['/users/1', '/users/2', '/users/3' /* ...more URLs */];

const { results, errors } = await doddle(urls)
  .concurrency(5) // Run up to 5 requests at a time
  .rateLimit({ perSecond: 10 }) // No more than 10 requests per second
  .map(async (url) => {
    const response = await api.get(url);
    return response.data;
  })
  .run();

console.log('Successfully fetched users:', results);
console.log('Failed requests:', errors);

In that one elegant chain, we've defined a process that is parallel, resource-friendly, and resilient to failure. This is the fix I've been searching for.

Doddle's Core Features: Power Meets Simplicity

Doddle isn't packed with a million features. It's focused on doing one thing exceptionally well: async iteration. Here are the pillars that make it so effective.

Controlled Concurrency

This is the most critical feature. You can precisely define how many tasks should be running at any given moment. This prevents you from overwhelming services or your own event loop.

// Process up to 3 files at a time
await doddle(filePaths)
  .concurrency(3)
  .forEach(async (path) => {
    await processFile(path);
  })
  .run();

Graceful Error Handling

Unlike `Promise.all`, Doddle doesn't bail on the first error. It continues processing the other items and collects all errors for you to inspect later. The final result is an object containing both the successful `results` and the `errors`.

// One URL is bad, but the others will still be fetched
const urls = ['https://good.com', 'https://bad.com', 'https://another-good.com'];

const { results, errors } = await doddle(urls)
  .map(async (url) => fetch(url))
  .run();

// results will contain 2 successful responses
// errors will contain 1 error object for 'https://bad.com'

Effortless Rate-Limiting

Many APIs enforce strict rate limits. Implementing this manually with `setTimeout` and complex logic is messy. With Doddle, it's a single line. It ensures your operations stay within the defined limits, whether it's per second, per minute, or over any interval.

// Only make 50 API calls per minute
await doddle(userIDs)
  .rateLimit({ count: 50, per: 60 * 1000 }) // 50 per minute
  .concurrency(10)
  .map(id => api.fetchUser(id))
  .run();

Doddle vs. The Alternatives: A Head-to-Head Comparison

To see where Doddle fits in, let's compare it directly with native JavaScript methods and other popular patterns.

Async Iteration Method Comparison
| Feature | `for await...of` | `Promise.all` | `p-map` (Library) | Doddle |
| --- | --- | --- | --- | --- |
| API Style | Imperative Loop | Declarative | Declarative | Fluent / Chained |
| Concurrency | No (Sequential) | Unlimited | Yes (Fixed) | Yes (Fixed) |
| Rate Limiting | No | No | No | Yes (Built-in) |
| Error Handling | Stops on throw | Fails entire batch | Fails batch (by default) | Continues & Collects |
| Ease of Use | Very High | High | Medium | Very High |
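For reference, the `p-map` row assumes that library's mapper-plus-options API, which looks roughly like this (reusing the `api.getUser` client and `userIds` from the examples later in this post):

import pMap from 'p-map';

// Fixed concurrency, but no built-in rate limiting; by default the whole
// batch rejects as soon as one mapper call throws (stopOnError defaults to true).
const users = await pMap(userIds, id => api.getUser(id), { concurrency: 5 });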

Practical Use Case: Processing 10,000 API Records

Let's solidify this with a real-world scenario. We need to fetch 10,000 user records from an API that is paginated and has a rate limit of 100 requests per minute.

The Old, Clunky Way

Without Doddle, you might write something like this, involving manual chunking, delays, and complex error aggregation. It's verbose and hard to reason about.

async function processAllUsers() {
  const allUsers = [];
  const allErrors = [];
  const userIds = Array.from({ length: 10000 }, (_, i) => i + 1);
  const chunkSize = 10; // To control concurrency

  for (let i = 0; i < userIds.length; i += chunkSize) {
    const chunk = userIds.slice(i, i + chunkSize);
    const promises = chunk.map(id => api.getUser(id).catch(e => e));
    
    const results = await Promise.all(promises);
    
    results.forEach(res => {
      if (res instanceof Error) allErrors.push(res);
      else allUsers.push(res);
    });

    // Crude rate limiting: pause 6s between chunks of 10 ≈ 100 requests per minute
    await new Promise(resolve => setTimeout(resolve, 6000));
  }
  return { allUsers, allErrors };
}

The New, Doddle Way

With Doddle, the logic becomes declarative, clean, and immensely more readable. All the complexity is abstracted away.

import doddle from 'doddle';

async function processAllUsersWithDoddle() {
  const userIds = Array.from({ length: 10000 }, (_, i) => i + 1);

  const { results, errors } = await doddle(userIds)
    // Max 100 requests per 60 seconds
    .rateLimit({ count: 100, per: 60 * 1000 })
    // Run 10 requests in parallel to maximize throughput within the rate limit
    .concurrency(10)
    // The mapping function to fetch each user
    .map(id => api.getUser(id))
    // Execute the chain
    .run();

  return { allUsers: results, allErrors: errors };
}

The difference is night and day. The intent is clear, the code is minimal, and the process is far more robust.

Conclusion: Why Doddle is My Go-To for Async Work

Building Doddle wasn't just an academic exercise; it was a solution to a persistent problem that slowed down development and introduced fragility into my applications. By focusing on a clean, fluent API that elegantly solves concurrency, rate limiting, and error handling, it has become an indispensable part of my toolkit.

In 2025, we shouldn't be writing boilerplate to handle fundamental async patterns. We need tools that let us focus on our application's business logic, not on the plumbing. For me, Doddle is that tool. It provides the control you need without sacrificing the simplicity you want, making async iteration a true doddle.