Bubble sort (sometimes called sinking sort) is usually the first sorting algorithm you’ll encounter when learning to code. It’s simple, easy to wrap your head around, stable, and in-place. The catch? Its quadratic time complexity makes it painfully slow for large datasets. But here’s the thing—understanding bubble sort gives you a solid mental model for how comparison-based sorting actually works: swaps, passes, and complexity analysis.
Real-World Analogy: Picture a line of people standing in random order. You want them arranged from shortest to tallest. Someone walks down the line comparing each pair of neighbors. If the person on the left is taller, they swap places. After one complete walk-through, the tallest person will have “bubbled up” to the far right. Keep repeating this process, and eventually everyone stands in perfect height order. That’s bubble sort in a nutshell.
At its heart, bubble sort repeatedly scans through the array and swaps neighbors that are out of order.
For an array of length n (sorting in ascending order):
1. Start with j = 0.
2. Compare arr[j] and arr[j + 1].
3. If arr[j] > arr[j + 1], swap them.
4. Move j forward until you hit the end of the unsorted portion.
5. Repeat for up to n - 1 passes.

After k passes, the last k elements are locked in place.
Key Concept: Bubble sort only swaps adjacent elements. This is what makes it stable (equal elements keep their relative order) and dead simple to implement in-place.
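The steps above can be sketched as a basic (no early exit) version in Java. The class name here is just for illustration:

```java
public class BasicBubbleSort {
    // Basic bubble sort: always runs n - 1 passes, even if the array
    // is already sorted. Only adjacent elements are ever swapped,
    // which is why the sort is stable and works in place.
    public static void sort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {         // one pass per iteration
            for (int j = 0; j < n - i - 1; j++) { // last i elements are locked
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 3, 8, 4, 2};
        sort(arr);
        System.out.println(java.util.Arrays.toString(arr)); // [2, 3, 4, 5, 8]
    }
}
```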
Use this interactive visualizer to watch how basic (non-optimized) bubble sort moves through an array, pass by pass and swap by swap.
As you step through it, notice:
- How the unsorted region shrinks with each pass (first [0..n-1], then [0..n-2], and so on).

Let’s sort [5, 3, 8, 4, 2] in ascending order using the basic algorithm.
Pass 1 (compare indices 0 to 3):
- (5, 3) → swap → [3, 5, 8, 4, 2]
- (5, 8) → no swap needed → [3, 5, 8, 4, 2]
- (8, 4) → swap → [3, 5, 4, 8, 2]
- (8, 2) → swap → [3, 5, 4, 2, 8]

8 is now locked at the end.

Pass 2 (compare indices 0 to 2):
- (3, 5) → already good
- (5, 4) → swap → [3, 4, 5, 2, 8]
- (5, 2) → swap → [3, 4, 2, 5, 8]

5 is now fixed at index 3.

Pass 3 (compare indices 0 to 1):
- (3, 4) → looks good
- (4, 2) → swap → [3, 2, 4, 5, 8]

4 is locked at index 2.

Pass 4 (compare index 0):
- (3, 2) → swap → [2, 3, 4, 5, 8]

Done. Array sorted.
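A quick way to reproduce this walkthrough is to print the array after every pass. This is a small sketch (the class and method names are mine, for illustration):

```java
import java.util.Arrays;

public class BubbleSortTrace {
    // Basic bubble sort that prints the array state after each pass.
    public static void sortPrintingPasses(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                }
            }
            System.out.println("After pass " + (i + 1) + ": " + Arrays.toString(arr));
        }
    }

    public static void main(String[] args) {
        sortPrintingPasses(new int[]{5, 3, 8, 4, 2});
        // After pass 1: [3, 5, 4, 2, 8]
        // After pass 2: [3, 4, 2, 5, 8]
        // After pass 3: [3, 2, 4, 5, 8]
        // After pass 4: [2, 3, 4, 5, 8]
    }
}
```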
The basic version has a glaring inefficiency—it keeps running passes even after the array’s already sorted. We can fix this with a simple trick: exit early if no swaps happen.
The Strategy:
- At the start of each pass, set swapped = false.
- Whenever a swap happens during the pass, set swapped = true.
- After the pass, swapped is still false? You’re done. The array’s sorted.

This single optimization drops the best-case time complexity from O(n²) down to O(n) (when the input’s already sorted).
- If no swaps occur in a pass, the array is already sorted.
- After each pass, the largest unsorted element is in its final position.
- Elements after the last swap are already sorted.
Try running the optimized version on:
- An already sorted array
- A nearly sorted array
- A reverse-sorted array

You’ll see that for sorted or nearly sorted inputs, it stops after just a few passes. But for reverse-sorted arrays? It still grinds through all those quadratic comparisons.
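You can measure this directly by counting passes with the early-exit version. A small sketch (the counting harness is mine, not part of the implementation below):

```java
public class PassCounter {
    // Optimized bubble sort that reports how many passes it actually ran.
    static int sortCountingPasses(int[] arr) {
        int n = arr.length;
        int passes = 0;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            passes++;
            if (!swapped) break; // early exit: nothing moved, so we're done
        }
        return passes;
    }

    public static void main(String[] args) {
        int[] sorted   = {1, 2, 3, 4, 5, 6, 7, 8};
        int[] reversed = {8, 7, 6, 5, 4, 3, 2, 1};
        System.out.println("Sorted input:   " + sortCountingPasses(sorted) + " pass");
        System.out.println("Reversed input: " + sortCountingPasses(reversed) + " passes");
        // Sorted input:   1 pass
        // Reversed input: 7 passes
    }
}
```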
Here’s the optimized bubble sort in Java.
public class BubbleSort {
    public static void sort(int[] arr) {
        int n = arr.length;
        boolean swapped;
        // Outer loop runs at most n-1 passes
        for (int i = 0; i < n - 1; i++) {
            swapped = false;
            // Last i elements are already in place
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    // Swap adjacent elements
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            // If no elements were swapped, array is sorted
            if (!swapped) {
                break;
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {64, 34, 25, 12, 22, 11, 90};
        sort(arr);
        System.out.print("Sorted array: ");
        for (int num : arr) {
            System.out.print(num + " ");
        }
    }
}

Implementation notes:
- Combines both standard optimizations: the swapped flag and a shrinking inner-loop range.

Worst case: O(n²)
- A reverse-sorted array forces all n - 1 passes, for O(n²) comparisons total.

Average case: O(n²)

Best case (with optimization): O(n)
- On sorted input, the first pass makes n - 1 comparisons, zero swaps, checks swapped == false, and bails out.

Time Complexity: Bubble sort clocks in at O(n²) for average and worst cases because of those nested loops. It only hits O(n) on sorted input when you’ve got the early-exit optimization in place.
Space complexity: O(1)
- Only O(1) auxiliary space: sorting happens in place with a handful of extra variables (i, j, swapped, and a temp for swapping).

In the real world, bubble sort gets absolutely crushed by O(n log n) algorithms like Quicksort, Mergesort, or Timsort (Python’s built-in sort).
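In practice you’d just call the library sort. In Java, java.util.Arrays.sort on a primitive array uses a dual-pivot Quicksort under the hood:

```java
import java.util.Arrays;

public class LibrarySortDemo {
    public static void main(String[] args) {
        int[] arr = {64, 34, 25, 12, 22, 11, 90};
        // Dual-pivot Quicksort for primitive arrays:
        // O(n log n) on average, no hand-rolled loops needed.
        Arrays.sort(arr);
        System.out.println(Arrays.toString(arr)); // [11, 12, 22, 25, 34, 64, 90]
    }
}
```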
Use this visualization to see how bubble sort’s performance tanks compared to efficient alternatives as your dataset grows.
What you’ll typically see:
- For tiny inputs (n <= 10), constant factors matter more than Big-O, so bubble sort can be “fast enough” while being trivial to write.
- As n grows? That n² becomes a nightmare compared to n log n.

Look, you shouldn’t reach for bubble sort in production for anything beyond trivial data sizes. But it’s still useful in specific situations:
Education and Interviews:
- It’s the classic vehicle for teaching swaps, passes, and complexity analysis, and a common interview warm-up.

Tiny Collections:
- For very small inputs, constant factors dominate and the trivial implementation can be “fast enough.”
Nearly Sorted Data:
- With the early-exit optimization, runtime approaches O(n).

Restricted Environments:
- In constrained settings where n is small, the simplicity can be a genuine advantage.

Edge Cases and Pitfalls:
- Forget to subtract i in the inner loop (n - i - 1) and you’ll waste time re-comparing already sorted elements.
- Skip the swapped flag and your best-case stays at O(n²). Ouch.
- Writing j <= n - i - 1 instead of j < n - i - 1 will cause an out-of-bounds crash when checking arr[j+1].
- Flip the comparison (< instead of >) and suddenly you’re sorting in reverse.

Bubble Sort is inherently O(n²), but you can squeeze out some improvements:
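That last pitfall cuts both ways: flipping the comparison deliberately gives you a descending sort. A minimal sketch (class name is mine, for illustration):

```java
import java.util.Arrays;

public class DescendingBubbleSort {
    public static void sort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - i - 1; j++) {
                // '<' instead of '>' bubbles the smallest element
                // to the end, sorting largest-first
                if (arr[j] < arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            if (!swapped) break;
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 3, 8, 4, 2};
        sort(arr);
        System.out.println(Arrays.toString(arr)); // [8, 5, 4, 3, 2]
    }
}
```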
- Exit early with the swapped flag.
- Shrink the inner loop to n - i - 1 to skip the sorted tail.
- Neither changes the worst case; it’s still O(n²) though.
- Compared to it, another O(n²) algorithm, Insertion Sort, is usually faster in practice because it does fewer writes.

Practical Tip: Think of bubble sort as your gateway drug. Once you’re comfortable with it, move on to insertion sort, then selection sort, and finally tackle quicksort or mergesort to see how they compare.
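For comparison, here’s a minimal insertion sort sketch, the other O(n²) algorithm mentioned above (class name is mine, for illustration):

```java
import java.util.Arrays;

public class InsertionSortSketch {
    public static void sort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            // Shift larger elements right instead of swapping pairwise,
            // so each element is written far fewer times than in bubble sort.
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 3, 8, 4, 2};
        sort(arr);
        System.out.println(Arrays.toString(arr)); // [2, 3, 4, 5, 8]
    }
}
```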
- Worst and average case: O(n²)
- Best case (with early exit): O(n)
- Space: O(1)

Mastering bubble sort isn’t about using it in real projects. It’s about building a rock-solid foundation in how sorting actually works under the hood. Once you’ve got that down, you’ll find it way easier to understand and implement more sophisticated algorithms.