Your Database is the Bottleneck
Had a Laravel app that took 4 seconds to load a dashboard. Turns out it was running 200+ queries. After optimization, same page loads in 200ms with 8 queries.
Here's what I learned.
The N+1 Problem
This is probably slowing down your app right now:
$posts = Post::all();
foreach ($posts as $post) {
    echo $post->author->name; // Each iteration = another query
}
If you have 50 posts, that's 51 queries: one to get the posts, then one per post to fetch its author.
Fix it with eager loading:
$posts = Post::with('author')->get();
foreach ($posts as $post) {
    echo $post->author->name; // No extra queries
}
Now it's 2 queries total: one for posts, one for all the related authors.
Finding N+1 Queries
Install Laravel Debugbar for development:
composer require barryvdh/laravel-debugbar --dev
It shows you every query on each page load. Look for duplicate queries that only differ in the WHERE clause - that's your N+1.
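Laravel can also catch N+1s automatically: `Model::preventLazyLoading()` (available since Laravel 8.43) throws an exception whenever a relationship is lazy loaded, so the bug fails loudly in development and tests instead of silently slowing production. A minimal sketch for your `AppServiceProvider`:

```php
// app/Providers/AppServiceProvider.php
use Illuminate\Database\Eloquent\Model;

public function boot(): void
{
    // Throw on any lazy-loaded relationship outside production,
    // so missing with() calls surface immediately in dev and CI.
    Model::preventLazyLoading(!app()->isProduction());
}
```

In production the flag is false, so an overlooked lazy load degrades gracefully instead of throwing.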
Eager Load Nested Relationships
You can chain them:
$posts = Post::with(['author', 'comments.user'])->get();
This loads posts, their authors, all comments, and the users who wrote those comments. Still just a few queries.
Add Indexes Where It Matters
Every column you filter or sort on should probably have an index:
Schema::table('posts', function (Blueprint $table) {
    $table->index('user_id');      // Foreign keys
    $table->index('status');       // WHERE status = 'published'
    $table->index('published_at'); // ORDER BY published_at
});
Composite indexes for common query patterns:
$table->index(['status', 'published_at']);
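Column order matters in a composite index: it serves queries that constrain the leftmost column(s) first. The index above lines up with a query like this, for example:

```php
// Served by index(['status', 'published_at']):
// equality filter on the first column, sort on the second.
$posts = Post::where('status', 'published')
    ->orderByDesc('published_at')
    ->take(20)
    ->get();
```

The same index would not help a query that only sorts by `published_at` without filtering on `status`.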
Selecting Only What You Need
Don't do this:
$users = User::all(); // Grabs every column
Do this:
$users = User::select(['id', 'name', 'email'])->get();
Especially if you have TEXT or BLOB columns you don't need.
Processing Large Datasets
Loading 100k records into memory is a bad idea:
// This will crash on large tables
User::all()->each(function ($user) {
    // process
});
Use chunking instead:
User::chunk(1000, function ($users) {
    foreach ($users as $user) {
        // Process 1000 at a time
    }
});
For simple operations, lazy collections work too:
User::lazy()->each(function ($user) {
    // Streams results, low memory usage
});
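One caveat: if the chunk callback updates the column you're filtering or ordering by, plain `chunk()` can skip rows because the result pages shift underneath it. `chunkById()` pages by primary key and avoids that. A sketch, assuming hypothetical `active` and `archived` columns:

```php
// chunkById pages on the primary key, so updating rows
// inside the callback doesn't shift later pages.
User::where('active', false)->chunkById(1000, function ($users) {
    foreach ($users as $user) {
        $user->update(['archived' => true]); // hypothetical column
    }
});
```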
Caching Expensive Queries
If something doesn't change often, cache it:
$popularPosts = Cache::remember('popular-posts', 3600, function () {
    return Post::withCount('views')
        ->orderByDesc('views_count')
        ->take(10)
        ->get();
});
Invalidate when data changes:
// In your Post model
protected static function booted()
{
    static::saved(fn () => Cache::forget('popular-posts'));
    static::deleted(fn () => Cache::forget('popular-posts')); // deletes should invalidate too
}
Raw Queries for Complex Stuff
Eloquent is great, but sometimes raw SQL is clearer and faster:
$stats = DB::select('
    SELECT
        DATE(created_at) as date,
        COUNT(*) as total,
        SUM(amount) as revenue
    FROM orders
    WHERE created_at >= ?
    GROUP BY DATE(created_at)
', [now()->subDays(30)]);
Monitor in Production
Add slow query logging:
// In a service provider's boot method
if (!app()->environment('local')) {
    DB::listen(function ($query) {
        if ($query->time > 500) {
            Log::warning('Slow query', [
                'sql' => $query->sql,
                'time' => $query->time . 'ms',
            ]);
        }
    });
}
Quick Wins
- Always eager load relationships you'll access
- Index foreign key columns
- Use pagination instead of loading everything
- Cache queries that run on every request
- Check query count on key pages regularly
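For the pagination point above, a minimal sketch:

```php
// 25 rows per page; the page number comes from the ?page= query string.
$posts = Post::latest('published_at')->paginate(25);

// Cursor pagination is worth considering for very large tables -
// it avoids the slow OFFSET scans of deep page numbers.
$posts = Post::orderBy('id')->cursorPaginate(25);
```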
Most performance problems are solved by fixing N+1 queries and adding missing indexes. Start there.
