
Laravel Queue Real-World Use Cases – Bulk Email, PDF, Image Processing & Webhooks

Shahroz Javed
Mar 18, 2026

Bulk Email Sending with Rate Limiting

Sending a newsletter to 50,000 subscribers is a classic queue use case. The wrong approach is looping and sending emails in a single request or a single job. The right approach is one job per recipient, with rate limiting to respect your mail provider's limits.

The Wrong Way (Don't Do This)

// ❌ Never do this in a controller
public function sendNewsletter(Newsletter $newsletter)
{
    $users = User::subscribed()->get();  // could be 50,000 rows!

    foreach ($users as $user) {
        Mail::to($user)->send(new NewsletterMail($newsletter));  // blocks the request
    }
}

The Right Way — Batch of Individual Jobs

Create a dispatcher job that chunks the recipients and dispatches individual send jobs. This splits the work into small, independently retryable units:

// App\Jobs\DispatchNewsletter.php
class DispatchNewsletter implements ShouldQueue
{
    use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(protected Newsletter $newsletter) {}

    public function handle(): void
    {
        // Chunk to avoid loading all 50k users into memory at once.
        // chunkById() is safer than chunk() here: users unsubscribing
        // mid-iteration would otherwise shift the result pages.
        User::subscribed()
            ->select(['id', 'email', 'name'])
            ->chunkById(200, function ($users) {
                $jobs = $users->map(fn ($user) =>
                    new SendNewsletterToUser($this->newsletter, $user)
                )->all();

                // Add the send jobs to the batch this dispatcher runs in,
                // so the whole send is tracked as a single batch
                $this->batch()->add($jobs);
            });
    }
}

// App\Jobs\SendNewsletterToUser.php
class SendNewsletterToUser implements ShouldQueue
{
    use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $tries = 3;
    public array $backoff = [30, 120, 300];

    // Rate limit: respect mail provider API limits
    public function middleware(): array
    {
        return [new RateLimited('newsletter-emails')];
    }

    public function __construct(
        protected Newsletter $newsletter,
        protected User $user
    ) {}

    public function handle(): void
    {
        if ($this->batch()?->cancelled()) {
            return;
        }

        // Check if user unsubscribed while the job was queued
        if (! $this->user->isSubscribed()) {
            return;
        }

        Mail::to($this->user)->send(new NewsletterMail($this->newsletter));
    }
}

// Define rate limiter in AppServiceProvider::boot()
RateLimiter::for('newsletter-emails', function () {
    return Limit::perMinute(200);  // Mailgun free = 300/min, leave buffer
});
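The limit above also fixes the total delivery time, which is worth checking before you pick a number — plain arithmetic, no framework needed:

```php
// Back-of-envelope throughput for the 200/minute limit above:
// a 50,000-recipient newsletter drains in 50000 / 200 = 250 minutes.
$perMinute  = 200;
$recipients = 50000;

$minutes = (int) ceil($recipients / $perMinute);          // 250
echo intdiv($minutes, 60) . 'h ' . ($minutes % 60) . 'm'; // 4h 10m
```

If four hours is too slow for your audience, raise the limit (within your provider's cap) or shard across multiple sending domains.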

Track Progress from the Controller

// Controller — dispatch and store batch ID
public function sendNewsletter(Newsletter $newsletter)
{
    $batch = Bus::batch([new DispatchNewsletter($newsletter)])
        ->name("Newsletter #{$newsletter->id}")
        ->onQueue('newsletters')
        ->allowFailures()
        ->dispatch();

    $newsletter->update(['batch_id' => $batch->id]);

    return response()->json(['batch_id' => $batch->id, 'status' => 'queued']);
}

// Status endpoint
public function status(Newsletter $newsletter)
{
    $batch = Bus::findBatch($newsletter->batch_id);

    abort_if(! $batch, 404);  // batch pruned or never dispatched

    return response()->json([
        'progress'  => $batch->progress(),
        'processed' => $batch->processedJobs(),
        'failed'    => $batch->failedJobs,
        'finished'  => $batch->finished(),
    ]);
}

PDF Generation & Storage

PDF generation is CPU-heavy. A complex invoice with charts and tables can take 2–5 seconds. Never do it in a request cycle. Queue it, store the result, and notify the user when it's ready.

// App\Jobs\GenerateInvoicePdf.php
class GenerateInvoicePdf implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $timeout = 120;
    public int $tries = 3;
    public bool $deleteWhenMissingModels = true;

    public function __construct(protected Invoice $invoice) {}

    public function handle(): void
    {
        // Using Spatie Laravel PDF or Dompdf
        $pdf = Pdf::view('invoices.template', [
            'invoice'     => $this->invoice->load('items', 'customer'),
            'company'     => config('company'),
        ])->format('a4');

        // Store on S3 (or local disk)
        $path = "invoices/{$this->invoice->id}/invoice-{$this->invoice->number}.pdf";
        Storage::disk('s3')->put($path, $pdf->toString());

        // Update the invoice record with the download path
        $this->invoice->update([
            'pdf_path'       => $path,
            'pdf_generated_at' => now(),
            'status'         => 'ready',
        ]);

        // Notify the user the PDF is ready
        $this->invoice->customer->notify(new InvoicePdfReady($this->invoice));
    }

    public function failed(\Throwable $e): void
    {
        $this->invoice->update(['status' => 'pdf_failed']);

        \Log::error("PDF generation failed for invoice #{$this->invoice->id}: " . $e->getMessage());
    }
}

// In your controller:
public function generate(Invoice $invoice)
{
    $invoice->update(['status' => 'generating']);
    GenerateInvoicePdf::dispatch($invoice)->onQueue('pdf');

    return response()->json(['message' => 'PDF generation started. You will be notified when ready.']);
}
⚠️ Always store generated files on a shared disk (S3, shared NFS) — not local storage. If you have multiple workers on multiple servers, a file stored locally on Worker A won't be accessible from Worker B or your web server.
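For reference, the shared disk is just the standard `s3` entry in `config/filesystems.php` — a minimal sketch using Laravel's default env variable names:

```php
// config/filesystems.php — the shared disk both workers and web servers read
'disks' => [
    's3' => [
        'driver' => 's3',
        'key'    => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
    ],
],
```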

Image Processing & Thumbnails

When a user uploads a profile photo or product image, you need multiple sizes. Generate them asynchronously so the upload response is instant.

// App\Jobs\ProcessUploadedImage.php
use Intervention\Image\Facades\Image;

class ProcessUploadedImage implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $timeout = 60;

    // Image sizes to generate
    private array $sizes = [
        'thumb'  => [150, 150],
        'medium' => [400, 400],
        'large'  => [1200, 1200],
    ];

    public function __construct(
        protected string $originalPath,
        protected int    $userId
    ) {}

    public function handle(): void
    {
        $original = Storage::disk('s3')->get($this->originalPath);
        $extension = pathinfo($this->originalPath, PATHINFO_EXTENSION);
        $basePath = dirname($this->originalPath);

        foreach ($this->sizes as $name => [$width, $height]) {
            $resized = Image::make($original)
                ->fit($width, $height, function ($constraint) {
                    $constraint->upsize();  // don't upscale small images
                })
                ->encode($extension, 85);  // 85% quality

            $sizePath = "{$basePath}/{$name}.{$extension}";
            Storage::disk('s3')->put($sizePath, $resized->getEncoded());
        }

        // Update user's avatar paths
        User::find($this->userId)?->update([
            'avatar_thumb'  => "{$basePath}/thumb.{$extension}",
            'avatar_medium' => "{$basePath}/medium.{$extension}",
            'avatar_large'  => "{$basePath}/large.{$extension}",
        ]);
    }
}

// Controller — upload and queue
public function uploadAvatar(Request $request)
{
    $request->validate(['avatar' => 'required|image|max:5120']);

    $path = $request->file('avatar')->store(
        "avatars/{$request->user()->id}/original",
        's3'
    );

    ProcessUploadedImage::dispatch($path, $request->user()->id)
        ->onQueue('media');

    return response()->json(['message' => 'Avatar uploaded. Processing in background.']);
}

Third-Party API Sync

Syncing data to external services (CRMs, ERPs, analytics, shipping providers) is one of the most common production queue use cases. The key is making it resilient to API failures with proper backoff, deduplication, and idempotent writes.

// App\Jobs\SyncCustomerToHubspot.php
class SyncCustomerToHubspot implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $tries = 5;
    public array $backoff = [30, 60, 120, 300, 600]; // exponential

    public function __construct(protected Customer $customer) {}

    // Per-customer uniqueness — avoid redundant syncs for same customer
    public function uniqueId(): string
    {
        return 'hubspot-sync-' . $this->customer->id;
    }

    public function middleware(): array
    {
        return [
            new RateLimited('hubspot-api'),       // respect API rate limits
            new ThrottlesExceptions(3, 5),         // back off when API is down
        ];
    }

    public function handle(HubspotService $hubspot): void
    {
        // Re-fetch fresh data — user may have updated since dispatch
        $customer = $this->customer->fresh();

        if (! $customer) return;  // deleted since dispatch

        if ($customer->hubspot_id) {
            // Update existing contact
            $hubspot->updateContact($customer->hubspot_id, $this->buildPayload($customer));
        } else {
            // Create new contact and store the Hubspot ID
            $hubspotId = $hubspot->createContact($this->buildPayload($customer));
            $customer->update(['hubspot_id' => $hubspotId]);
        }

        $customer->update(['hubspot_synced_at' => now()]);
    }

    private function buildPayload(Customer $customer): array
    {
        return [
            'email'     => $customer->email,
            'firstname' => $customer->first_name,
            'lastname'  => $customer->last_name,
            'phone'     => $customer->phone,
            'company'   => $customer->company?->name,
        ];
    }

    public function failed(\Throwable $e): void
    {
        \Log::error("Hubspot sync failed for customer #{$this->customer->id}: " . $e->getMessage());
    }
}

// Trigger from a model observer — automatically syncs on any update
class CustomerObserver
{
    public function updated(Customer $customer): void
    {
        // Only sync if relevant fields changed
        if ($customer->isDirty(['email', 'first_name', 'last_name', 'phone'])) {
            SyncCustomerToHubspot::dispatch($customer)->onQueue('sync');
        }
    }
}

Idempotent Webhook Processing

Webhooks from payment providers (Stripe, PayPal) can be delivered more than once. Your queue job must be idempotent — running it multiple times must produce the same result.

// routes/api.php — receive webhook, verify, then queue
Route::post('/webhooks/stripe', function (Request $request) {
    // Verify signature BEFORE queuing — reject invalid webhooks immediately
    $payload = $request->getContent();
    $signature = $request->header('Stripe-Signature');

    try {
        $event = \Stripe\Webhook::constructEvent(
            $payload,
            $signature,
            config('services.stripe.webhook_secret')
        );
    } catch (\Exception $e) {
        return response('Invalid signature', 400);
    }

    // Store the raw webhook for audit trail + idempotency check.
    // firstOrCreate keyed on the event ID makes redeliveries reuse the same row.
    $webhook = WebhookLog::firstOrCreate(
        [
            'provider' => 'stripe',
            'event_id' => $event->id,     // Stripe guarantees unique event IDs
        ],
        [
            'event_type' => $event->type,
            'payload'    => $payload,
            'status'     => 'pending',
        ]
    );

    ProcessStripeWebhook::dispatch($webhook)->onQueue('webhooks');

    return response('OK', 200);  // Respond fast — Stripe times out at 30s
});

// App\Jobs\ProcessStripeWebhook.php
class ProcessStripeWebhook implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $tries = 5;
    public array $backoff = [10, 30, 60, 120, 300];

    public function __construct(protected WebhookLog $webhook) {}

    public function handle(): void
    {
        // Idempotency check — was this event already processed?
        if ($this->webhook->status === 'processed') {
            return;  // already handled, skip silently
        }

        $payload = json_decode($this->webhook->payload, true);
        $event   = $payload['type'];
        $data    = $payload['data']['object'];

        // Route to the appropriate handler
        match ($event) {
            'payment_intent.succeeded'   => $this->handlePaymentSucceeded($data),
            'payment_intent.payment_failed' => $this->handlePaymentFailed($data),
            'customer.subscription.created' => $this->handleSubscriptionCreated($data),
            default => null,  // ignore unhandled event types
        };

        // Mark as processed
        $this->webhook->update(['status' => 'processed', 'processed_at' => now()]);
    }

    private function handlePaymentSucceeded(array $data): void
    {
        $order = Order::where('payment_intent_id', $data['id'])->firstOrFail();

        // Only update if not already paid — idempotent update
        if ($order->status !== 'paid') {
            $order->update(['status' => 'paid', 'paid_at' => now()]);
            $order->customer->notify(new OrderPaidNotification($order));
        }
    }

    public function failed(\Throwable $e): void
    {
        $this->webhook->update(['status' => 'failed', 'error' => $e->getMessage()]);
    }
}
⚠️ Always respond to the webhook provider immediately (within their timeout, e.g. 30s for Stripe). Store and queue — never process synchronously in the webhook endpoint. If your job fails and retries, the idempotency check prevents double-processing.
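Stripped of the framework, the idempotency pattern is simply "process each event ID at most once". A minimal sketch, with an in-memory array standing in for the `webhook_logs` table:

```php
// Run $handler for $eventId only on the first delivery; redeliveries no-op.
function processOnce(array &$processed, string $eventId, callable $handler): bool
{
    if (isset($processed[$eventId])) {
        return false;  // duplicate delivery — already handled
    }
    $handler();
    $processed[$eventId] = true;
    return true;
}

$processed = [];
$paid = 0;
$markPaid = function () use (&$paid) { $paid++; };

processOnce($processed, 'evt_123', $markPaid);  // first delivery — runs
processOnce($processed, 'evt_123', $markPaid);  // redelivery — skipped
// $paid stays at 1, no matter how many times the provider retried
```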

Large CSV / Excel Import

Importing 100,000 rows from a CSV is a textbook batch processing problem. Never do it in a single job or a single request. Split into chunks.

// App\Jobs\ImportCsvFile.php — the dispatcher job
class ImportCsvFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $timeout = 300;

    public function __construct(
        protected string $filePath,
        protected int    $importedBy
    ) {}

    public function handle(): void
    {
        $handle = fopen(Storage::path($this->filePath), 'r');
        $headers = fgetcsv($handle);  // read the header row
        $chunkSize = 500;
        $chunk = [];
        $batchJobs = [];

        while (($row = fgetcsv($handle)) !== false) {
            $chunk[] = array_combine($headers, $row);

            if (count($chunk) >= $chunkSize) {
                $batchJobs[] = new ProcessCsvChunk($chunk, $this->importedBy);
                $chunk = [];
            }
        }

        // Don't forget the last partial chunk
        if (! empty($chunk)) {
            $batchJobs[] = new ProcessCsvChunk($chunk, $this->importedBy);
        }

        fclose($handle);

        // Batch callbacks are serialized — don't reference $this inside them
        $filePath = $this->filePath;

        Bus::batch($batchJobs)
            ->name("CSV Import by user #{$this->importedBy}")
            ->allowFailures()
            ->finally(function (Batch $batch) use ($filePath) {
                Storage::delete($filePath);  // cleanup temp file
            })
            ->onQueue('imports')
            ->dispatch();
    }
}

// App\Jobs\ProcessCsvChunk.php — processes one chunk
class ProcessCsvChunk implements ShouldQueue
{
    use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $tries = 3;

    public function __construct(
        protected array $rows,
        protected int   $importedBy
    ) {}

    public function handle(): void
    {
        if ($this->batch()?->cancelled()) return;

        $toInsert = [];

        foreach ($this->rows as $row) {
            // Validate and transform each row
            if (empty($row['email']) || ! filter_var($row['email'], FILTER_VALIDATE_EMAIL)) {
                continue;  // skip invalid rows silently
            }

            $toInsert[] = [
                'name'       => trim($row['name'] ?? ''),
                'email'      => strtolower(trim($row['email'])),
                'created_by' => $this->importedBy,
                'created_at' => now(),
                'updated_at' => now(),
            ];
        }

        // Upsert to handle duplicates gracefully
        Contact::upsert($toInsert, ['email'], ['name', 'updated_at']);
    }
}

Conclusion

Real-world queue usage always follows the same pattern: dispatch fast, process smart, fail safely. The key principles across all these examples:

  • One job per unit of work — one email, one image, one row. Small jobs are independently retryable.

  • Never load all data upfront — always chunk. Use ->chunk() on Eloquent and stream large files.

  • Make jobs idempotent — especially for webhook processing. Running twice must not cause double effects.

  • Always handle the failed() method — update records, alert your team, clean up partial work.

  • Re-fetch models inside handle() — data may have changed since dispatch. Use $model->fresh().

  • Store files on shared storage — S3 or shared NFS, never local disk.
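One detail worth knowing about the `$backoff` arrays used throughout: retry N waits `$backoff[N-1]` seconds, and once the attempts outrun the array the last value repeats. This sketch assumes Laravel's documented behaviour:

```php
function backoffDelay(array $backoff, int $attempt): int
{
    // Attempt 1 failed → wait $backoff[0]; past the end, reuse the last entry
    return $backoff[min($attempt - 1, count($backoff) - 1)];
}

backoffDelay([30, 120, 300], 1);  // 30  — before retry #1
backoffDelay([30, 120, 300], 2);  // 120 — before retry #2
backoffDelay([30, 120, 300], 5);  // 300 — schedule exhausted, last value repeats
```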
