Master Laravel Bulk Inserts & Get Your Models Back (2025)
Tired of slow database inserts? Learn how to perform high-performance Laravel bulk inserts and efficiently retrieve your Eloquent models back in 2025.
Alex Petrov
Senior PHP & Laravel developer specializing in application performance and database optimization.
Introduction: The High Cost of Slow Inserts
As a Laravel developer, you've inevitably faced this scenario: you have a large array of data—maybe from a CSV import, an API response, or a complex calculation—and you need to persist it to your database. The naive approach is simple: loop through the array and create a new Eloquent model for each item. It works, but as the dataset grows from a hundred to thousands of records, your application grinds to a halt. You're hit with script timeouts, memory limits, and a frustrated user staring at a loading spinner.
This performance bottleneck is caused by treating a bulk operation as a series of individual ones. Each `Model::create()` call is a separate, expensive conversation with your database.
In this comprehensive 2025 guide, we'll dissect this problem and show you how to use Laravel's powerful tools to perform lightning-fast bulk inserts. More importantly, we'll solve the common follow-up question that stumps many developers: how do you get your fully-formed Eloquent models back after a bulk insert? Let's dive in.
Why Single Inserts in a Loop are a Performance Killer
Before we jump to the solution, it's crucial to understand why the common loop-based approach is so inefficient. This is often called the "N+1 Insert Problem," a cousin to the infamous "N+1 Query Problem."
// The slow way
use App\Models\User;
use Illuminate\Support\Facades\Hash;

$data = [ /* 10,000 items */ ];

foreach ($data as $item) {
    User::create([
        'name' => $item['name'],
        'email' => $item['email'],
        'password' => Hash::make('password'),
    ]);
}
Understanding the Overhead
When you run the code above for 10,000 items, you are not executing one operation. You are executing 10,000 separate operations. Each `create()` call involves:
- Network Latency: A round trip from your application server to your database server. Even on the same machine, this has overhead.
- Database Transaction: By default, each `create()` runs as its own implicit transaction (autocommit), adding a commit's worth of overhead for every single row.
- Query Parsing: The database has to parse and plan the execution of 10,000 individual `INSERT` statements.
- Eloquent Events: For every single model, Eloquent fires events like `creating`, `created`, `saving`, and `saved`. While powerful, this adds significant overhead in a bulk scenario.
Imagine going to the grocery store for 100 items, but instead of putting them all in one cart, you walk in, grab one item, pay for it, walk out, and repeat the process 99 more times. That's the inefficiency we're dealing with.
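If you genuinely need per-model Eloquent events for each row, there is a middle ground worth knowing: wrapping the loop in a single transaction. You still pay for one `INSERT` per row, but you remove the per-row commit overhead. A minimal sketch:

```php
use App\Models\User;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Hash;

// Middle ground: one INSERT per row (so events still fire),
// but a single COMMIT at the end instead of 10,000 of them.
DB::transaction(function () use ($data) {
    foreach ($data as $item) {
        User::create([
            'name' => $item['name'],
            'email' => $item['email'],
            'password' => Hash::make('password'),
        ]);
    }
});
```

This helps, but it doesn't eliminate the round trips or query parsing, which is why the approaches below are still far faster.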
The Standard Approach: Laravel's `DB::insert()`
Laravel provides a much faster, native way to handle this: using the Query Builder's `insert` method. This method builds a single, massive `INSERT` statement with all your data.
// The fast way
use App\Models\User;
use Illuminate\Support\Facades\Hash;

$usersToInsert = [];
$now = now();

// Hash once, outside the loop: bcrypt is deliberately slow, so hashing
// the same placeholder 10,000 times would dwarf the query savings.
$password = Hash::make('password');

foreach ($data as $item) {
    $usersToInsert[] = [
        'name' => $item['name'],
        'email' => $item['email'],
        'password' => $password,
        // insert() bypasses Eloquent, so timestamps must be set manually.
        'created_at' => $now,
        'updated_at' => $now,
    ];
}

User::insert($usersToInsert); // Or DB::table('users')->insert($usersToInsert);
This is orders of magnitude faster. A process that took minutes now takes seconds. Why? Because it's just one trip to the database. One query to parse, one transaction to handle.
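One practical note: a single statement carrying tens of thousands of rows can run into driver limits (bound-parameter counts, or MySQL's `max_allowed_packet`). A common safeguard, sketched here with an illustrative chunk size of 1,000 rows, is to split the batch with `array_chunk()`:

```php
// Still only a handful of queries instead of thousands,
// while staying safely under placeholder/packet limits.
foreach (array_chunk($usersToInsert, 1000) as $chunk) {
    User::insert($chunk);
}
```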
But there's a huge catch. The `insert()` method returns a boolean: `true` on success, `false` on failure. It gives you no information about the records you just created. You don't get their IDs, and you certainly don't get back a collection of Eloquent models. So, what if you need to associate these new users with another model, or log their creation with their new IDs?
The Holy Grail: Bulk Inserting AND Getting Models Back
This is where we get clever. We can combine the raw speed of `insert()` with a smart, secondary query to retrieve the exact models we just created. The right strategy depends on your primary key type: UUIDs or auto-incrementing integers.
Strategy 1: The UUID Advantage
If you're using UUIDs (or ULIDs) for your primary keys, the solution is clean, elegant, and robust. Since you generate the primary keys *before* the insert, you already know what they are.
use App\Models\User;
use Illuminate\Support\Facades\Hash;
use Illuminate\Support\Str;

$usersToInsert = [];
$uuids = [];
$now = now();
$password = Hash::make('password'); // hash once, not once per row

foreach ($data as $item) {
    $uuid = (string) Str::uuid();
    $uuids[] = $uuid;
    $usersToInsert[] = [
        'id' => $uuid, // We set the primary key ourselves
        'name' => $item['name'],
        'email' => $item['email'],
        'password' => $password,
        'created_at' => $now,
        'updated_at' => $now,
    ];
}
// Step 1: Perform the fast bulk insert
User::insert($usersToInsert);
// Step 2: Retrieve the newly created models in a single query
$newlyCreatedUsers = User::whereIn('id', $uuids)->get();
// You now have a collection of Eloquent models!
This is the recommended approach for modern applications. It's safe from concurrency issues and completely reliable. (Note that Laravel's `HasUuids` trait generates keys through Eloquent's model events, which `insert()` bypasses, so you still need to set the IDs manually as shown.)
Strategy 2: The Auto-Increment Gamble (Use With Caution!)
What if you're stuck with traditional auto-incrementing IDs? We can still get the models back, but it comes with an important caveat.
The strategy relies on the fact that auto-incrementing IDs are typically assigned in a contiguous block during a single `INSERT` statement. We can get the ID of the *first* inserted record and then calculate the range of all new IDs.
use App\Models\User;
use Illuminate\Support\Facades\DB;

// Prepare data (without IDs this time)
$usersToInsert = [/* ... same as before, but no 'id' field ... */];

DB::beginTransaction();

try {
    // Step 1: Read the ID that the *next* insert will receive.
    // Note: this query is MySQL-specific; other drivers need another approach.
    $status = DB::select("show table status like 'users'");
    $firstInsertId = $status[0]->Auto_increment;

    // Step 2: Perform the bulk insert
    User::insert($usersToInsert);

    // Step 3: Calculate the range of inserted IDs
    $count = count($usersToInsert);
    $lastInsertId = $firstInsertId + $count - 1;

    // Step 4: Retrieve the models
    $newlyCreatedUsers = User::whereBetween('id', [$firstInsertId, $lastInsertId])->get();

    DB::commit();
} catch (\Throwable $e) {
    DB::rollBack();

    throw $e; // Or handle the exception as appropriate
}
CRITICAL CAVEAT: This method is only safe in a low-concurrency environment. If another connection inserts a row into the `users` table between your `show table status` query and your `insert()` call, your starting ID will be wrong and you will retrieve an incorrect set of models — and the transaction does *not* prevent this, because other connections can insert regardless of it. Be aware, too, that on MySQL 8 `SHOW TABLE STATUS` may return a cached, stale `Auto_increment` value unless `information_schema_stats_expiry` is set to 0. For high-traffic applications this method is risky. Table locks can mitigate it, but UUIDs are a far superior solution.
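A somewhat safer MySQL-specific variant is to read the first generated ID *after* the insert. MySQL's `LAST_INSERT_ID()`, exposed through `PDO::lastInsertId()`, returns the ID of the *first* row of a multi-row `INSERT` and is scoped to your own connection, so other sessions cannot skew it. This still assumes the statement received a consecutive block of IDs, which you should verify against your `innodb_autoinc_lock_mode` setting; a sketch:

```php
use App\Models\User;
use Illuminate\Support\Facades\DB;

User::insert($usersToInsert);

// Per-connection: other sessions' inserts cannot affect this value.
// For a multi-row INSERT, MySQL returns the FIRST generated ID.
$firstInsertId = (int) DB::getPdo()->lastInsertId();
$lastInsertId  = $firstInsertId + count($usersToInsert) - 1;

$newlyCreatedUsers = User::whereBetween('id', [$firstInsertId, $lastInsertId])->get();
```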
| Method | Performance | Returns Models? | Best Use Case |
|---|---|---|---|
| `foreach` loop with `Model::create()` | Poor | Yes (one by one) | Very small datasets (< 20 records) or when Eloquent events are required for each model. |
| `DB::insert()` or `Model::insert()` | Excellent | No (returns boolean) | Fire-and-forget bulk inserts where you don't need the created models back immediately. |
| `insert()` + `whereIn('id', ...)` with UUIDs | Excellent | Yes (in a second query) | The recommended modern approach. Perfect for any size dataset when using UUIDs. |
| `insert()` + `whereBetween('id', ...)` | Excellent | Yes (with caveats) | For auto-incrementing IDs in low-concurrency systems or within a locked transaction. |
Encapsulating Logic: Building a Reusable Bulk Insert Service
To keep your controllers clean and your logic reusable, it's a great practice to extract this functionality into a dedicated service class. This makes your code more testable and maintainable.
The Service Class
Let's create a simple service that handles the UUID-based approach.
app/Services/BulkOperationService.php
<?php

namespace App\Services;

use Illuminate\Support\Collection;
use Illuminate\Support\Str;

class BulkOperationService
{
    /**
     * Performs a bulk insert using UUIDs and returns the created models.
     *
     * @param  string  $modelClass  The Eloquent model class name.
     * @param  array   $data        The array of data to insert.
     * @return Collection
     */
    public function insertWithUuidsAndRetrieve(string $modelClass, array $data): Collection
    {
        if (empty($data)) {
            return new Collection();
        }

        $model = new $modelClass();
        $keyName = $model->getKeyName();
        $now = now();
        $uuids = [];
        $recordsToInsert = [];

        foreach ($data as $record) {
            $uuid = (string) Str::uuid();
            $uuids[] = $uuid;
            $recordsToInsert[] = array_merge($record, [
                $keyName => $uuid,
                'created_at' => $now,
                'updated_at' => $now,
            ]);
        }

        $modelClass::insert($recordsToInsert);

        return $modelClass::whereIn($keyName, $uuids)->get();
    }
}
Using the Service
Now, in your controller or command, you can inject this service and use it cleanly.
use App\Models\Product;
use App\Services\BulkOperationService;

class ProductImportController
{
    public function store(BulkOperationService $bulkService)
    {
        $data = [ /* array of product data without IDs */ ];

        $newProducts = $bulkService->insertWithUuidsAndRetrieve(Product::class, $data);

        // $newProducts is now a collection of your new Product models.
        // You can now dispatch jobs, create relationships, etc.

        return response()->json(['message' => 'Products imported successfully!']);
    }
}
A Quick Note on `upsert()`
Since Laravel 8, the `upsert()` method has been available. It's designed to perform a bulk "insert or update" operation. It's incredibly efficient for synchronizing data.
However, it's important to note that like `insert()`, `upsert()` does not return the models. It returns an integer representing the number of affected rows. While it's a powerful tool for bulk data management, it doesn't solve the specific problem of retrieving the models you just created.
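For completeness, here's a minimal `upsert()` sketch (the `Product` columns are illustrative). The second argument names the column(s) that uniquely identify a row, and the third names the columns to update when a match is found:

```php
use App\Models\Product;

// Insert new rows, or update 'price' where a row with the same 'sku' exists.
// Returns the number of affected rows, not the models themselves.
$affected = Product::upsert(
    [
        ['sku' => 'A-100', 'name' => 'Widget', 'price' => 9.99],
        ['sku' => 'B-200', 'name' => 'Gadget', 'price' => 19.99],
    ],
    ['sku'],    // uniqueBy
    ['price']   // columns to update on conflict
);
```

If you do need the models back after an upsert, the same UUID-based retrieval pattern from Strategy 1 applies.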