Boosting API Efficiency with Laravel HTTP Client: Request Batching Explained

In this post, we’ll dive into what request batching is in Laravel, how it works, when to use it, and best practices — plus code examples you can drop into your projects today.

As web developers, we often deal with making multiple HTTP calls in our applications — whether to microservices, third-party APIs, or internal endpoints. Sending them one by one can feel slow and inefficient. What if we could batch them, track progress, and handle failures gracefully? That’s where Laravel's request batching comes in.

What is Request Batching?

Laravel's HTTP client (a wrapper around Guzzle) supports concurrent requests using two approaches: pools and batches.

  • Pools let you issue multiple requests concurrently, essentially “fire and collect,” but with limited hooks for tracking progress or orchestrating callbacks.
  • Batches build upon the same idea of concurrency but add lifecycle hooks — before, progress, then, catch, finally — giving you much finer control over how to respond as the batch executes.

In short: batching = concurrency + orchestration.
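
For comparison, here's a pool in its simplest form. It fires the same requests concurrently and hands back the responses once everything has finished, but without the lifecycle hooks (a minimal sketch; the endpoint URLs are placeholders):

use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

// Pools: fire the requests concurrently, then collect the responses.
$responses = Http::pool(fn (Pool $pool) => [
    $pool->get('https://api.example.com/users'),
    $pool->get('https://api.example.com/posts'),
    $pool->get('https://api.example.com/comments'),
]);

$responses[0]->ok(); // responses come back in the order the requests were added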


Anatomy of a Batch Request

Let’s break down how you set up and use a batch in Laravel:

 
use Illuminate\Http\Client\Batch;
use Illuminate\Support\Facades\Http;
use Illuminate\Http\Client\RequestException;
use Illuminate\Http\Client\Response;

$responses = Http::batch(function (Batch $batch) {
    $batch->get('https://api.example.com/users');
    $batch->get('https://api.example.com/posts');
    $batch->get('https://api.example.com/comments');
})
->before(function (Batch $batch) {
    // Called just after the batch is created but before any requests run
})
->progress(function (Batch $batch, int|string $key, Response $response) {
    // Called each time an individual request finishes successfully
})
->then(function (Batch $batch, array $results) {
    // Called if all requests succeed
})
->catch(function (Batch $batch, int|string $key, Response|RequestException $response) {
    // Called when the first request fails
})
->finally(function (Batch $batch, array $results) {
    // Called after all requests finish (success or failure)
})
->send();

Key Concepts & Properties

  • You can name requests using as('alias') to access their responses by key.
  • After the batch has been sent (->send()), you cannot add more requests; attempting to do so throws a BatchInProgressException.
  • On the Batch instance, you have useful state information:
    • $batch->totalRequests — total number of requests in that batch
    • $batch->pendingRequests — how many are yet to finish
    • $batch->failedRequests — count of failed ones
    • $batch->finished() — whether the batch is done
    • $batch->hasFailures() — whether any failed

These let you introspect the batch’s progress and health.
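
For example, a progress callback can combine these properties with named requests to report how far along the batch is (a minimal sketch reusing the placeholder endpoints from above):

use Illuminate\Http\Client\Batch;
use Illuminate\Http\Client\Response;
use Illuminate\Support\Facades\Http;

Http::batch(function (Batch $batch) {
    $batch->as('users')->get('https://api.example.com/users');
    $batch->as('posts')->get('https://api.example.com/posts');
})
->progress(function (Batch $batch, int|string $key, Response $response) {
    // Use the batch's state properties to report how far along we are.
    $done = $batch->totalRequests - $batch->pendingRequests;
    logger("[$key] done ({$done}/{$batch->totalRequests} finished, {$batch->failedRequests} failed so far)");
})
->then(function (Batch $batch, array $results) {
    // Named requests come back keyed by their alias.
    $users = $results['users'];
    $posts = $results['posts'];
})
->send();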


When to Use Batching (vs Pooling or Sequential Calls)

Here are scenarios where batching shines:

  • You're hitting multiple endpoints that are independent: you get concurrency plus orchestration (hooks).
  • You need to track progress (e.g. updating a UI or logging): the progress callback gives per-request updates.
  • You want unified error handling: catch the first failure, or handle everything in finally.
  • You want to name and manage responses more cleanly: as('alias') gives you named responses.

That said, batching isn’t always the right tool:

  • When requests must run in strict sequence (e.g. one depends on result of previous)
  • When the number of parallel HTTP calls is huge and risks overwhelming the target API (one mitigation is chunking into smaller batches, sketched after this list)
  • In simpler use-cases where a pool (with less overhead) suffices
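
If you do have a large number of calls, one simple mitigation (a rough sketch, not a built-in feature) is to chunk the URLs and send several smaller batches one after another:

use Illuminate\Http\Client\Batch;
use Illuminate\Support\Facades\Http;

// Hypothetical list of many endpoints; the chunk size of 10 is an arbitrary choice.
$urls = ['https://api.example.com/items/1', /* ... */];
$allResults = [];

foreach (array_chunk($urls, 10) as $chunk) {
    Http::batch(function (Batch $batch) use ($chunk) {
        foreach ($chunk as $url) {
            // Alias each request by its URL so the results stay uniquely keyed.
            $batch->as($url)->get($url);
        }
    })
    ->finally(function (Batch $batch, array $results) use (&$allResults) {
        // Collect whatever finished in this chunk (finally runs on success or failure).
        $allResults = array_merge($allResults, $results);
    })
    ->send();
}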

Real-World Example: Fetching Data from Multiple Services

Suppose your app needs to gather data from three microservices — user profiles, transactions, and notifications — and then render a dashboard once everything’s in.

 
use Illuminate\Http\Client\Batch;
use Illuminate\Support\Facades\Http;

$responses = Http::batch(function (Batch $batch) {
    $batch->as('users')->get('https://api.myapp.com/users');
    $batch->as('transactions')->get('https://api.myapp.com/transactions');
    $batch->as('notifications')->get('https://api.myapp.com/notifications');
})
->progress(function (Batch $batch, $key, $response) {
    // We could log progress or push to websocket
    logger("Batch request [$key] finished with status {$response->status()}");
})
->then(function (Batch $batch, $results) {
    $users = $results['users'];
    $transactions = $results['transactions'];
    $notifications = $results['notifications'];
    // Merge / process / return combined payload
})
->catch(function (Batch $batch, $key, $responseOrException) {
    // One of them failed — maybe fallback or retry
    logger("Batch failed at request [$key]");
})
->finally(function (Batch $batch, $results) {
    // Cleanup, metrics, notify that batch is done
})
->send();

This pattern gives you both concurrency and full control over what happens at each stage.


Tips, Gotchas & Best Practices

  1. Limit concurrency wisely
    Don’t flood the underlying API or your network. Consider throttling or breaking into smaller batches if you have many calls (see the chunking sketch earlier).
  2. Use meaningful aliases
    Naming requests helps readability (especially in then / catch).
  3. Monitor failures early
    Don’t wait till everything fails; use catch and finally wisely.
  4. Don’t mutate after send
    Once ->send() is called, you can’t attach more requests.
  5. Fallback strategies
    In catch, you might retry specific failed requests or fallback to cached data.
  6. Testing & faking
    Use Http::fake() to simulate batch responses in tests; Laravel’s HTTP faking also covers batch behavior (a short sketch follows below).

    See the official Laravel documentation: https://laravel.com/docs/12.x/http-client#request-batching
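
Here's a minimal sketch of the faking tip above, reusing the endpoints from the dashboard example; the assertions are the standard HTTP-client test helpers:

use Illuminate\Http\Client\Batch;
use Illuminate\Support\Facades\Http;

// In a test: fake the endpoints before the batch is sent.
Http::fake([
    'api.myapp.com/users' => Http::response(['data' => []], 200),
    'api.myapp.com/transactions' => Http::response(['data' => []], 200),
]);

Http::batch(function (Batch $batch) {
    $batch->as('users')->get('https://api.myapp.com/users');
    $batch->as('transactions')->get('https://api.myapp.com/transactions');
})->send();

// The usual HTTP-client assertions still apply to batched requests.
Http::assertSentCount(2);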


Summary

Laravel’s request batching is a powerful, expressive tool for making multiple HTTP requests concurrently, with the added benefits of lifecycle hooks, progress monitoring, and robust error handling. If your app interacts with multiple services at once, batching gives you a cleaner, more controlled way to orchestrate those calls than a naive sequential loop.
