Avoiding unshift() for Better Performance in JavaScript

Rehmat Sayany
4 min read · Oct 22, 2024

Performance becomes critical when working with large arrays in JavaScript, especially if your application processes data in real time or handles large datasets. One common task is moving a specific element to the beginning of an array. However, not all approaches to this problem are equally efficient.

In this blog post, we’ll explore an example of this operation using the unshift() method, and why it’s not the best approach. We’ll then introduce a more optimized solution that reduces memory overhead and improves performance.

The Problem: Moving an Array Element to the Front

Suppose you have an array of product variants, each identified by a unique SKU (Stock Keeping Unit). Your task is to move a specific SKU to the first position in the array whenever it matches a given selectedSku.

For example, given this array:

const variants = ['SKUA', 'SKUB', 'SKUC', 'SKUD', ...];

If we receive the SKU SKUC, we need to move it to the first position, resulting in:

['SKUC', 'SKUA', 'SKUB', 'SKUD', ...]

The Naïve Approach: Using unshift()

At first glance, it might seem that the unshift() method is perfect for this task. It adds an element to the beginning of an array, pushing all other elements one index further. Here’s a naive implementation:

const selectedSku = 'SKUC'; // the SKU that should end up first
const reordered = [];

for (let i = 0; i < variants.length; i++) {
  if (selectedSku && variants[i] === selectedSku) {
    reordered.unshift(variants[i]); // Move the matching variant to the front
  } else {
    reordered.push(variants[i]); // Append non-matching variants in their original order
  }
}

While this code may seem simple, it’s highly inefficient for large arrays. Let’s explore why.

Why unshift() is Problematic

  1. Memory Overhead: every time unshift() is called, all existing elements are shifted one index to the right. The array isn’t literally copied on each call, but the engine still has to move every element (and may have to grow the backing storage), so on arrays with thousands or millions of entries this churn adds real memory and CPU pressure.
  2. Performance Penalty: unshift() has a time complexity of O(n) because every existing element must be shifted by one position. If it ends up being called repeatedly inside a loop, the total work grows towards O(n^2): the larger your array, the slower your code becomes. This is particularly harmful when processing large datasets in real time, as the sketch below makes easy to see.
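
To make the quadratic behaviour concrete, here is a small, self-contained sketch. The array sizes and timing code are purely illustrative (they are not from the original example), and the exact numbers will vary between machines and engines. It builds the same array once with push() and once with unshift():

function buildWithPush(n) {
  const result = [];
  for (let i = 0; i < n; i++) {
    result.push(i); // amortized O(1) per call
  }
  return result;
}

function buildWithUnshift(n) {
  const result = [];
  for (let i = 0; i < n; i++) {
    result.unshift(i); // O(result.length) per call, so O(n^2) overall
  }
  return result;
}

for (const n of [10000, 50000, 100000]) {
  console.time(`push, n=${n}`);
  buildWithPush(n);
  console.timeEnd(`push, n=${n}`);

  console.time(`unshift, n=${n}`);
  buildWithUnshift(n);
  console.timeEnd(`unshift, n=${n}`);
}

On a typical engine the push() timings grow roughly linearly with n, while the unshift() timings grow far faster, which is exactly the quadratic pattern described above.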

Optimizing the Approach: Iterating and Swapping

The key to optimization is to avoid shifting the entire array unnecessarily. Instead, we can:

  • Iterate over the array once to find the matching SKU.
  • Store the matching element temporarily.
  • Swap the matching element with the first element in the array after the loop completes.

This approach minimizes memory usage and reduces the time complexity to O(n).

Here’s the optimized code:

let skuFound = null; // Temporary variable to store the matching variant
let matchIndex = -1; // Index where the matching variant sits

// Iterate through the array and find the matching variant
for (let i = 0; i < variants.length; i++) {
  if (selectedSku && variants[i] === selectedSku) {
    skuFound = variants[i]; // Save the matching variant
    matchIndex = i; // Remember its position
    break; // Exit the loop once the match is found
  }
}

// If the matching variant was found (and isn't already first), swap it with the first element
if (skuFound !== null && matchIndex > 0) {
  variants[matchIndex] = variants[0]; // Move the current first element into the match's old slot
  variants[0] = skuFound; // Put the matching variant first
}

Why This Approach is Better

  1. No Unnecessary Shifting: the loop only reads the array, and the single swap at the end touches just two positions. The whole array is never shifted or rebuilt, so memory usage stays flat even as the array grows.
  2. Faster Performance: since unshift() is never called inside the loop, we avoid the O(n^2) blow-up. The optimized solution runs in O(n), which is significantly faster for larger datasets.
  3. Cleaner Code: the optimized version reads naturally; we find the matching element and then swap it to the front.
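
One caveat worth spelling out: a swap drops the element that was previously first into the matched variant’s old slot, so the relative order of the other variants changes. The example at the top of the post, by contrast, keeps SKUA ahead of SKUB. If that order matters, a single splice() followed by a single unshift() still does the job in one pass; the problem described earlier is calling unshift() repeatedly inside a loop, not calling it once. Here is a minimal sketch of that variant (the function name is mine, not from the post):

function moveSkuToFront(variants, selectedSku) {
  const index = variants.indexOf(selectedSku);
  if (index > 0) {
    const [match] = variants.splice(index, 1); // remove the match from its old position
    variants.unshift(match); // a single O(n) insertion at the front
  }
  return variants;
}

console.log(moveSkuToFront(['SKUA', 'SKUB', 'SKUC', 'SKUD'], 'SKUC'));
// -> ['SKUC', 'SKUA', 'SKUB', 'SKUD']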

Performance Comparison: unshift() vs Optimized Approach

Let’s compare the performance of these two approaches using arrays of different sizes:

[Chart: comparison of unshift() vs. the swap-based approach across different array sizes]

As the array size grows, the difference in performance becomes stark. The unshift() method may work fine for small arrays, but it quickly becomes a bottleneck as the dataset grows larger.
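
The exact numbers behind such a comparison depend on your engine, your data, and where the matching SKU sits, so here is a rough, self-contained harness you can run in Node.js or the browser to measure both approaches yourself. The sizes, labels, and helper names are mine, not from the post, and the output should be treated as a starting point rather than a definitive result:

// Rough benchmark harness; sizes and helper names are illustrative only.
function moveWithUnshift(variants, selectedSku) {
  const reordered = [];
  for (let i = 0; i < variants.length; i++) {
    if (selectedSku && variants[i] === selectedSku) {
      reordered.unshift(variants[i]); // shifts everything already collected
    } else {
      reordered.push(variants[i]);
    }
  }
  return reordered;
}

function moveWithSwap(variants, selectedSku) {
  const index = variants.indexOf(selectedSku);
  if (index > 0) {
    const previousFirst = variants[0];
    variants[0] = variants[index];
    variants[index] = previousFirst;
  }
  return variants;
}

// performance.now() is available in browsers and as a global in recent Node.js versions.
function timeIt(label, fn, iterations = 10) {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  const average = (performance.now() - start) / iterations;
  console.log(`${label}: ${average.toFixed(2)} ms per run`);
}

for (const size of [10000, 100000, 1000000]) {
  const data = Array.from({ length: size }, (_, i) => `SKU${i}`);
  const target = `SKU${Math.floor(size / 2)}`; // match in the middle of the array

  timeIt(`unshift loop, size=${size}`, () => moveWithUnshift([...data], target));
  timeIt(`find-and-swap, size=${size}`, () => moveWithSwap([...data], target));
}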

Conclusion: Optimize for Scalability

While unshift() might seem like a convenient method for adding elements to the front of an array, it’s not ideal for scenarios where you’re working with large datasets. The cost of shifting every existing element on each call can severely impact both memory and performance.

By iterating through the array once, saving the matching element, and performing a simple swap at the end, we can dramatically improve the efficiency of this operation. This optimized approach ensures that your code scales well as your dataset grows, without sacrificing performance or memory usage.

When building performance-sensitive applications, always be mindful of the hidden costs of certain methods and look for opportunities to optimize. In the case of array manipulation, small tweaks can lead to significant gains!

Written by Rehmat Sayany

Full Stack developer @westwing passionate about NodeJS, TypeScript, React JS and AWS.
