In this post, we’ll dive into what request batching is in Laravel, how it works, when to use it, and best practices — plus code examples you can drop into your projects today.

As web developers, we often deal with making multiple HTTP calls in our applications — whether to microservices, third-party APIs, or internal endpoints. Sending them one by one can feel slow and inefficient. What if we could batch them, track progress, and handle failures gracefully? That’s where Laravel's request batching comes in.
What is Request Batching?
Laravel's HTTP client (a wrapper around Guzzle) supports concurrent requests using two approaches: pools and batches.
- Pools let you issue multiple requests concurrently, essentially “fire and collect,” but with limited hooks for tracking progress or orchestrating callbacks.
- Batches build upon the same idea of concurrency but add lifecycle hooks — before, progress, then, catch, finally — giving you much finer control over how to respond as the batch executes.
In short: batching = concurrency + orchestration.
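For context, the pool approach described above looks like this with Laravel's `Http::pool` helper: the requests run concurrently and you simply collect the responses, with no lifecycle hooks (the URLs below are placeholders):

```php
use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

// "Fire and collect": all three requests run concurrently and the
// responses come back as an array in the same order.
$responses = Http::pool(fn (Pool $pool) => [
    $pool->get('https://api.example.com/first'),
    $pool->get('https://api.example.com/second'),
    $pool->get('https://api.example.com/third'),
]);

$allOk = $responses[0]->ok() && $responses[1]->ok() && $responses[2]->ok();
```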
Anatomy of a Batch Request
Let’s break down how you set up and use a batch in Laravel:
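Here is a rough sketch of that anatomy, built from the pieces this post covers (`as()`, `->send()`, and the `before` / `progress` / `then` / `catch` / `finally` hooks). The URLs are placeholders and the exact closure signatures may differ from what's shown, so treat this as an outline and check the linked documentation for the precise API:

```php
use Illuminate\Http\Client\Batch;
use Illuminate\Support\Facades\Http;

// Register the requests, naming each one with as() so its response can be
// looked up by key once the batch completes.
$responses = Http::batch(fn (Batch $batch) => [
    $batch->as('users')->get('https://api.example.com/users'),
    $batch->as('posts')->get('https://api.example.com/posts'),
])
    ->before(function (Batch $batch) {
        // Runs before the batch starts executing.
    })
    ->progress(function (Batch $batch) {
        // Runs as individual requests complete.
    })
    ->then(function (Batch $batch) {
        // Runs when every request has succeeded.
    })
    ->catch(function (Batch $batch) {
        // Runs when a request fails.
    })
    ->finally(function (Batch $batch) {
        // Runs once the batch has finished, success or not.
    })
    ->send();

// Responses are keyed by the aliases assigned with as().
$users = $responses['users']->json();
```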
Key Concepts & Properties
- You can name requests using `as('alias')` to access responses by key.
- After the batch has been sent (`->send()`), you cannot add more requests; doing so throws a `BatchInProgressException`.
- The `Batch` instance exposes useful state information:
  - `$batch->totalRequests`: total number of requests in the batch
  - `$batch->pendingRequests`: how many have yet to finish
  - `$batch->failedRequests`: how many have failed
  - `$batch->finished()`: whether the batch is done
  - `$batch->hasFailures()`: whether any request failed
These let you introspect the batch’s progress and health.
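As a small illustration, a hypothetical helper like the one below (the function name and log format are mine) could turn that state into a progress summary, e.g. from a `progress` or `finally` callback:

```php
use Illuminate\Http\Client\Batch;

// Hypothetical helper: summarize a batch's state for logging or a UI.
function summarizeBatch(Batch $batch): string
{
    $completed = $batch->totalRequests - $batch->pendingRequests;

    return sprintf(
        '%d/%d requests finished, %d failed, batch done: %s',
        $completed,
        $batch->totalRequests,
        $batch->failedRequests,
        $batch->finished() ? 'yes' : 'no'
    );
}
```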
When to Use Batching (vs Pooling or Sequential Calls)
Here are scenarios where batching shines:
| Scenario | Why Batching Helps |
| --- | --- |
| You're hitting multiple independent endpoints | You get concurrency plus orchestration (hooks) |
| You need to track progress (e.g. updating a UI, logging) | The `progress` callback gives per-request updates |
| You want unified error handling | Catch the first failure in `catch`, or handle cleanup in `finally` |
| You want to name and manage responses more cleanly | `as('alias')` gives you named responses |
That said, batching isn’t always the right tool:
- When requests must run in strict sequence (e.g. one call depends on the result of the previous one)
- When the number of parallel HTTP calls is huge and risks overwhelming the target API
- In simpler use cases where a pool (with less overhead) suffices
Real-World Example: Fetching Data from Multiple Services
Suppose your app needs to gather data from three microservices — user profiles, transactions, and notifications — and then render a dashboard once everything’s in.
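A sketch of that dashboard fetch is below, reusing the hooks described earlier. The service hostnames, the `$userId` variable, and the closure signatures are illustrative assumptions rather than the exact API, so adapt it against the linked documentation:

```php
use Illuminate\Http\Client\Batch;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Log;

// $userId is assumed to be in scope (e.g. from the route).
$responses = Http::batch(fn (Batch $batch) => [
    $batch->as('profile')->get("https://users.internal/api/profile/{$userId}"),
    $batch->as('transactions')->get("https://payments.internal/api/transactions/{$userId}"),
    $batch->as('notifications')->get("https://notify.internal/api/notifications/{$userId}"),
])
    ->progress(function (Batch $batch) {
        // Per-request updates: useful for logging or pushing progress to the UI.
        Log::info("Dashboard batch: {$batch->pendingRequests} of {$batch->totalRequests} still pending");
    })
    ->catch(function (Batch $batch) {
        // First failure: log it; the dashboard may still render partial data.
        Log::warning("Dashboard batch has {$batch->failedRequests} failed request(s)");
    })
    ->finally(function (Batch $batch) {
        Log::info('Dashboard batch finished', ['had_failures' => $batch->hasFailures()]);
    })
    ->send();

// Render the dashboard from the named responses once everything is in.
return view('dashboard', [
    'profile'       => $responses['profile']->json(),
    'transactions'  => $responses['transactions']->json(),
    'notifications' => $responses['notifications']->json(),
]);
```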
This pattern gives you both concurrency and full control over what happens at each stage.
Tips, Gotchas & Best Practices
- Limit concurrency wisely. Don't flood the underlying API or your network; consider throttling or splitting the work into smaller batches if you have many calls.
- Use meaningful aliases. Naming requests helps readability, especially in `then` / `catch`.
- Monitor failures early. Don't wait until everything fails; use `catch` and `finally` wisely.
- Don't mutate after send. Once `->send()` is called, you can't attach more requests.
- Plan fallback strategies. In `catch`, you might retry specific failed requests or fall back to cached data.
- Test with fakes. Use `Http::fake()` to simulate batch responses in tests; Laravel's HTTP faking also covers batch behavior. A sketch follows this list.

See the official Laravel documentation: https://laravel.com/docs/12.x/http-client#request-batching
Summary
Laravel's request batching is a powerful, expressive tool for making multiple HTTP requests concurrently, with the added benefits of hooks, progress monitoring, and robust error handling. If your app interacts with multiple services at once, batching is a cleaner, more controlled way to orchestrate those calls than a naive sequential loop.