
Big O of Finding the Intersection in two Unsorted Arrays Using filter() in JavaScript

I just started learning Big O notation and I'm trying to understand the Big O of different functions to see which one is better.

I'm struggling to calculate the time and space complexity for the following code.

function findCommonElem(arr1, arr2) { 
  let result = arr1.filter(x => arr2.includes(x));
  console.log(result);
}

findCommonElem(arr1, arr2);

From what I understand, common array methods like filter() usually have a Big O of O(n), so in this case I'd guess it's O(m + n), depending on the length of each array. However, I could be super wrong.

Can someone explain, please? Thanks so much!

Bonus question: Compared to sorting the arrays and then using a while loop for the same function, which one would be considered "better"?

supvicky asked Oct 22 '25 16:10

2 Answers

The above function has a time complexity of O(M * N).

But can you make this solution more efficient?

Yes. You can reduce it to O(M + N).

TLDR: Use a hash table to achieve linear time complexity O(M + N).


Approach 1

Check every element of array 1 against every element of array 2. (This is the approach you're using.)

function findCommonElem(arr1, arr2) {
  return arr1.filter(x => arr2.includes(x));
}

const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];

console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];
  • Time complexity = O(M * N)

    • In the worst case, every element of array 1 is checked against every element of array 2, which gives M * N comparisons.
  • Space complexity = O(M) or O(N)

    • In the worst case, all of the elements of one of the two arrays end up in the intersection.
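To make the M * N count concrete, here's a small instrumented sketch (my addition, not part of the original answer). It uses some() in place of includes() only so a counter can be hooked in; the scanning behavior is the same.

```javascript
// Count how many element comparisons the nested approach performs.
function countComparisons(arr1, arr2) {
  let comparisons = 0;
  const result = arr1.filter(x =>
    arr2.some(y => {
      comparisons++; // one comparison per scanned element of arr2
      return y === x;
    })
  );
  return { result, comparisons };
}

// Worst case: no common elements, so every element of arr1 (M = 3)
// is compared against every element of arr2 (N = 4).
console.log(countComparisons([1, 3, 5], [2, 4, 6, 8]).comparisons); // 12 = 3 * 4
```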

Approach 2

Use a hash map to linearize the nested loop. First, populate the hash map with the elements of array 1. Then scan array 2, using the map to find the intersection.

function findCommonElem(arr1, arr2) {
  const map = new Map();

  arr1.forEach(item => map.set(item, true));

  return arr2.filter(item => map.has(item));
}

const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];

console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];

The above function returns the same output. But here's the difference - the nested loop is reduced to two linear loops for the arrays. This means both arrays are traversed only once.

  • Time complexity = O(M + N)

    • Array 1 is traversed once (M elements).
    • Array 2 is traversed once (N elements).
    • Checking if the map contains the element with map.has() takes constant time O(1).
    • Total run time = M + N
  • Space complexity = O(M) or O(N)

    • Space complexity stays the same here: the new hash map takes either O(M) or O(N) space, and the result array takes either O(M) or O(N) as well.
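One behavioral detail worth noting (my addition, not part of the answer above): filter() keeps duplicates from array 2. If you want each common value to appear only once, a small variant of the same map-based approach works, relying on the fact that Map.prototype.delete() returns true only when the key was actually present:

```javascript
// Variant of the map-based approach that reports each common value
// only once, even when the inputs contain duplicates.
function findCommonElemOnce(arr1, arr2) {
  const map = new Map();
  arr1.forEach(item => map.set(item, true));

  // delete() returns true on the first hit and false afterwards,
  // so repeated values in arr2 are filtered out automatically.
  return arr2.filter(item => map.delete(item));
}

console.log(findCommonElemOnce([1, 2, 2, 3], [2, 2, 3, 4])); // [2, 3]
```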


Approach 3

Use a Set instead of a Map. A Set is a better fit for this use case, since you don't need the mapped value (the true in approach 2).

function findCommonElem(arr1, arr2) {
  const set = new Set(arr1);

  return arr2.filter(item => set.has(item));
}

const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];

console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];

This uses slightly less space in practice, but the asymptotic time and space complexity are still the same.

  • Time complexity = O(M + N)
  • Space complexity = O(M) or O(N)

Thanks to Nick Parsons for pointing this out.
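As for the bonus question about sorting plus a while loop: that approach walks both sorted copies with two pointers. Sorting costs O(M log M + N log N), which dominates the O(M + N) merge walk, so asymptotically the hash-based approaches above are better; the sorted version's appeal is that it needs no extra hash structure. A sketch, assuming numeric arrays:

```javascript
// Sort-then-merge sketch for the bonus question (numeric arrays assumed).
function findCommonElemSorted(arr1, arr2) {
  const a = [...arr1].sort((x, y) => x - y);
  const b = [...arr2].sort((x, y) => x - y);
  const result = [];
  let i = 0;
  let j = 0;

  // Walk both sorted arrays once with two pointers.
  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) {
      result.push(a[i]);
      i++;
      j++;
    } else if (a[i] < b[j]) {
      i++; // a[i] is too small to match anything at or after b[j]
    } else {
      j++; // b[j] is too small to match anything at or after a[i]
    }
  }
  return result;
}

console.log(findCommonElemSorted([5, 3, 1, 9], [9, 2, 3, 7])); // [3, 9]
```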

kabirbaidhya answered Oct 25 '25 06:10


Let's say that arr1.length is n and arr2.length is m. The filter function runs the lambda you wrote once for each item in arr1. The lambda checks whether the item is in arr2, and in the worst case, where the item isn't found, includes() scans the whole array, i.e. m steps. Therefore arr1.filter(x => arr2.includes(x)) runs in O(n * m) in the worst case.

As for the space complexity, filter() creates a new array, and in the worst case that array is as big as the original, so the space complexity is O(n).
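A quick sketch of that space worst case: when arr2 contains everything in arr1, the filtered array is exactly as large as arr1.

```javascript
// Worst case for the result array's size: every element of nums1 also
// appears in nums2, so the filtered copy is as large as nums1 itself.
const nums1 = [1, 2, 3, 4];
const nums2 = [4, 3, 2, 1, 0];

const common = nums1.filter(x => nums2.includes(x));
console.log(common.length === nums1.length); // true
```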

Saar Davidson answered Oct 25 '25 07:10


