React Development

I Tested a 100-Component React App: 3 Lessons for 2025

I pushed a React app to 100 components and uncovered 3 crucial lessons for 2025. Learn about strategic memoization, modern state management, and code splitting.


Daniel Avery

Principal Frontend Engineer specializing in large-scale React applications and performance optimization.

7 min read

The Challenge: When a React App Hits 100 Components

It starts subtly. A little lag on a button click. A loading spinner that lingers a second too long. Before you know it, your "state-of-the-art" React dashboard, now boasting over 100 unique components, is groaning under its own weight. This was our reality. We intentionally built a complex data visualization tool to stress-test modern React practices and see where they break.

The goal was to simulate a real-world, mature application—the kind many of us will be maintaining or building in 2025. The initial build used a standard, sensible stack: Vite, React 18, TypeScript, and a single global state manager for simplicity. But as the component tree deepened and inter-dependencies grew, performance bottlenecks emerged that threatened the entire user experience. After weeks of profiling, refactoring, and testing, we distilled our findings into three foundational lessons that every React developer should carry into 2025.

Lesson 1: Aggressive Memoization is a Trap

When faced with sluggish re-renders, the developer community's first instinct is often to reach for React.memo, useMemo, and useCallback. The thinking is simple: prevent re-renders, and performance will improve. While well-intentioned, our 100-component experiment proved that wrapping everything in React.memo or a memoization hook is a dangerous anti-pattern.

The Hidden Cost of Over-Memoization

Memoization isn't free. Every time you use useMemo or useCallback, you introduce two costs:

  1. Memory Overhead: React has to store the cached result and its dependencies in memory. Across 100+ components, this adds up, potentially increasing the memory footprint of your application.
  2. Comparison Cost: On every render, React must compare the new dependency array with the previous one to decide whether to return the cached value or re-compute. For simple primitives, this is trivial. For complex objects or arrays, this comparison can sometimes be as expensive as the re-render you were trying to avoid.

We found components wrapped in React.memo that were still re-rendering because they were being passed new function or object references from their parents on every render. The solution wasn't more memoization; it was smarter architecture.
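To make this concrete, here is a minimal sketch of the pattern we kept finding (the component and prop names are illustrative, not from our actual codebase). The child is wrapped in React.memo, but it re-renders anyway whenever the parent creates a fresh callback; stabilizing the reference with useCallback is what actually lets the memoization do its job.

import React, { useCallback, useState } from 'react';

// Memoized child: it can skip re-rendering only if its props are referentially equal.
const FilterPanel = React.memo(function FilterPanel({ onChange }) {
  return <input onChange={(e) => onChange(e.target.value)} />;
});

function SearchHeader() {
  const [query, setQuery] = useState('');

  // Anti-pattern: a new function identity on every render defeats React.memo.
  // const handleChange = (value) => setQuery(value);

  // Fix: a stable reference lets the memoized child bail out of re-renders.
  const handleChange = useCallback((value) => setQuery(value), []);

  return (
    <>
      <p>Searching for: {query}</p>
      <FilterPanel onChange={handleChange} />
    </>
  );
}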

Strategic Memoization in Practice

Instead of aggressive memoization, adopt a strategic approach. Profile your application first using the React Developer Tools Profiler to identify the actual performance hogs. Apply memoization only when:

  • A component is computationally expensive: It performs complex calculations or renders a large, intricate DOM tree (e.g., a data grid with 1000 rows).
  • A component re-renders frequently with the same props: This often happens with pure, presentational components deep within the component tree.
  • Prop stability is crucial: When passing a function or object to a child component that is itself memoized or has a useEffect hook depending on that prop. Here, useCallback and useMemo are essential to prevent unnecessary child re-renders or effect triggers.

The takeaway: Don't memoize by default. Profile, identify bottlenecks, and use memoization as a precision tool, not a sledgehammer.
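As a rough illustration of memoization used as a precision tool (the data shape and component names are hypothetical), the sketch below memoizes only the genuinely expensive derivation and deliberately leaves a cheap presentational component alone:

import React, { useMemo } from 'react';

// Expensive derivation over a large dataset: a legitimate useMemo candidate.
function RevenueGrid({ rows, sortKey }) {
  const sortedRows = useMemo(
    () => [...rows].sort((a, b) => a[sortKey] - b[sortKey]),
    [rows, sortKey] // recompute only when the data or sort column changes
  );

  return (
    <table>
      <tbody>
        {sortedRows.map((row) => (
          <tr key={row.id}>
            <td>{row.name}</td>
            <td>{row.revenue}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}

// Cheap presentational component: wrapping this in React.memo would add
// comparison overhead without any measurable benefit.
function StatusBadge({ label }) {
  return <span className="badge">{label}</span>;
}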

Lesson 2: State Management Architecture is Your Bedrock

Our initial approach was to put everything in a single Zustand store. It was simple and it worked—until it didn't. With 100 components, many of which subscribed to slices of the global store, a single state change could trigger a cascade of re-renders across unrelated parts of the application. The problem wasn't the tool (Zustand is fantastic), but our monolithic approach to state.

The Limits of a Single Global Store

Placing all state—server cache, global UI state, and local component state—into one bucket creates tight coupling and performance issues. A change to a single piece of server data could cause a component managing a modal's open/close state to re-evaluate, even if it didn't use that data. This becomes unmanageable at scale.
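A condensed sketch of the coupling problem, assuming a Zustand store shaped roughly like this (the slice names are illustrative): a component that only cares about a modal flag still re-renders whenever the server data living in the same store changes.

import React from 'react';
import { create } from 'zustand';

// Anti-pattern: server cache and unrelated UI state share one bucket.
const useAppStore = create((set) => ({
  orders: [],                  // server data
  isSettingsModalOpen: false,  // unrelated UI state
  setOrders: (orders) => set({ orders }),
  toggleSettingsModal: () =>
    set((state) => ({ isSettingsModalOpen: !state.isSettingsModalOpen })),
}));

function SettingsModalTrigger() {
  // Subscribing without a selector means any store update, including a
  // refresh of `orders`, re-renders this component.
  const { isSettingsModalOpen, toggleSettingsModal } = useAppStore();

  return (
    <button onClick={toggleSettingsModal}>
      {isSettingsModalOpen ? 'Close settings' : 'Open settings'}
    </button>
  );
}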

The State Hierarchy Model: A Modern Approach

The key to scalable state management in 2025 is a hierarchical, multi-layered approach. Co-locate state with the components that need it, and choose the right tool for each type of state.

  • Server Cache State (e.g., React Query, SWR): All state that comes from your server belongs here. These libraries are purpose-built for data fetching, caching, re-validation, and optimistic updates. They handle the complexity of async data better than any global store.
  • Global UI State (e.g., Zustand, Jotai): This is for the small subset of state that is truly global and doesn't come from a server. Think user authentication status, theme (dark/light mode), or a shopping cart. Keep this store lean.
  • Local Component State (useState, useReducer): The vast majority of your UI state should live here. Is a dropdown open? Is an input field in an error state? Is a form being submitted? This state belongs inside the component that uses it.

By segmenting state this way, you dramatically reduce unnecessary re-renders. A change in server data only affects components using that specific query, and a local UI change is contained entirely within its own component.
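Below is a condensed sketch of the three layers working together. It assumes a QueryClientProvider higher in the tree, and the endpoint, query key, and store shape are illustrative rather than lifted from our codebase.

import React, { useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { create } from 'zustand';

// Global UI state: a lean store for the handful of truly app-wide values.
const useThemeStore = create((set) => ({
  theme: 'light',
  toggleTheme: () =>
    set((s) => ({ theme: s.theme === 'light' ? 'dark' : 'light' })),
}));

function OrdersPanel() {
  // Server cache state: owned entirely by React Query.
  const { data: orders, isLoading } = useQuery({
    queryKey: ['orders'],
    queryFn: () => fetch('/api/orders').then((res) => res.json()),
  });

  // Local component state: co-located with the component that uses it.
  const [isExpanded, setIsExpanded] = useState(false);

  const theme = useThemeStore((s) => s.theme);

  if (isLoading) return <p>Loading orders…</p>;

  return (
    <section className={`panel panel--${theme}`}>
      <button onClick={() => setIsExpanded((open) => !open)}>
        {isExpanded ? 'Hide details' : 'Show details'}
      </button>
      {isExpanded && <pre>{JSON.stringify(orders, null, 2)}</pre>}
    </section>
  );
}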

State Management Approach Comparison
Approach | Best For | Complexity | Performance Impact
Local State (useState) | Component-specific UI (e.g., form inputs, toggles) | Low | Minimal & contained
Prop Drilling | Simple state sharing in shallow trees (2-3 levels) | Low | Can cause unnecessary re-renders in deep trees
Context API | Low-frequency updates (e.g., theme, user) | Medium | High (re-renders all consumers on any change)
Global Store (Zustand) | Truly global, frequently updated UI state | Medium | Optimized, but can be slow if overused
Server Cache (React Query) | All server-side data (fetching, caching, mutations) | High | Excellent (granular, automatic management)

Lesson 3: Component-Level Code Splitting is Non-Negotiable

We implemented route-based code splitting from the start, a standard practice for any non-trivial app. However, our main dashboard page still required a massive initial JavaScript bundle. This page contained dozens of components: charts, tables, filters, modals, and info panels. Even though a user might only interact with a few of them, their code was downloaded and parsed upfront, leading to a slow Time to Interactive (TTI).

Why Route-Based Splitting Isn't Enough

In a component-heavy application like a dashboard, the main bundle for a single route can easily grow to several megabytes. The solution is to get more granular. Don't just split code by page; split it by component.

Implementing Granular Lazy Loading

React's built-in React.lazy() and Suspense make this incredibly powerful and easy to implement. Identify components that are:

  • Below the fold: Content the user has to scroll to see.
  • Triggered by user interaction: Modals, popovers, or complex forms that only appear after a click.
  • Heavy but non-critical: Large charting libraries or complex data visualizations that can load a moment after the main content.

By wrapping these components in React.lazy, you defer loading their code until they are actually needed. For our 100-component app, this was a game-changer. We reduced the initial bundle size for our main dashboard by over 60%, drastically improving the perceived load time.

A simple implementation looks like this:

import React, { Suspense } from 'react';

const HeavyChartComponent = React.lazy(() => import('./HeavyChartComponent'));

function Dashboard() {
  return (
    <div>
      <h1>My Dashboard</h1>
      {/* Other critical components load immediately */}
      <Suspense fallback={<div>Loading chart...</div>}>
        <HeavyChartComponent />
      </Suspense>
    </div>
  );
}

This simple pattern ensures your user gets a responsive UI immediately, while heavier, non-essential parts of the application load gracefully in the background.

Conclusion: Building Resilient React Apps for 2025

Pushing a React application to 100 components was a powerful lesson in architectural discipline. The popular techniques that work for small apps begin to crack under the pressure of scale. As we move into 2025, building performant, maintainable React apps requires a shift in mindset. It's less about clever tricks and more about solid foundations. By embracing strategic memoization, a hierarchical state model, and granular code splitting, you're not just fixing today's performance issues—you're building a resilient architecture that can scale to 200 components and beyond.
