I just started learning Big O notation and I'm trying to understand the Big O of different functions to see which one is better.
I'm struggling to calculate the time and space complexity for the following code.
function findCommonElem(arr1, arr2) {
  let result = arr1.filter(x => arr2.includes(x));
  console.log(result);
}

findCommonElem([1, 2, 3, 4], [3, 4, 5, 6]); // logs [3, 4]
From what I understand, common array methods like filter() usually have a Big O of O(n), so in this case I'd guess it's O(m + n), depending on the length of each array. However, I could be super wrong.
Can someone explain, please? Thanks so much!
Bonus question: compared to sorting the arrays and then using a while loop for the same function, which one would be considered "better"?
The above function has a time complexity of O(M * N). Can you make it more efficient? Yes, you can reduce it to O(M + N).
TL;DR: use a hash table to achieve linear time complexity, O(M + N).
Approach 1: Check every element of array 1 against every element of array 2. (This is the approach you're using.)
function findCommonElem(arr1, arr2) {
  return arr1.filter(x => arr2.includes(x));
}
const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];
console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];
Time complexity = O(M * N), since includes() scans up to N elements of arr2 for each of the M elements of arr1.
Space complexity = O(M), since the result array from filtering arr1 holds at most M elements.
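For intuition, here's the same brute-force approach written as explicit nested loops. This is roughly what filter() plus includes() amounts to, not how the engine literally implements those methods, and findCommonElemLoops is a hypothetical name used only for illustration:

function findCommonElemLoops(arr1, arr2) {
  const result = [];
  // Outer loop: up to M iterations over arr1.
  for (let i = 0; i < arr1.length; i++) {
    // Inner scan: up to N iterations over arr2 (what includes() does).
    for (let j = 0; j < arr2.length; j++) {
      if (arr1[i] === arr2[j]) {
        result.push(arr1[i]);
        break; // stop scanning arr2 once a match is found
      }
    }
  }
  return result; // worst case: M * N comparisons performed
}

console.log(findCommonElemLoops([1, 2, 3], [2, 3, 4])); // [2, 3]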
Approach 2: Use a hash map to linearize the nested loop. First, populate the map with the elements of array 1. Then filter array 2, using the map to find the intersection.
function findCommonElem(arr1, arr2) {
  const map = new Map();
  arr1.forEach(item => map.set(item, true));
  return arr2.filter(item => map.has(item));
}
const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];
console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];
The above function returns the same output, but here's the difference: the nested loop is replaced by two linear passes, so each array is traversed only once.
Time complexity = O(M + N). Each array is traversed once, and map.has() takes constant time, O(1), on average.
Space complexity = O(M + N). The map stores up to M entries and the result array holds at most N elements, which is still linear.
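Bonus: don't know much about how hash maps work internally? The key idea is that keys are hashed into buckets, so a lookup only scans one small bucket instead of the whole collection. Here's a toy sketch for integer keys; real JavaScript engines use far more sophisticated hashing and collision handling, and ToyHashSet is just an illustrative name:

class ToyHashSet {
  constructor(bucketCount = 16) {
    // Fixed number of buckets; each bucket holds the keys that hash to it.
    this.buckets = Array.from({ length: bucketCount }, () => []);
  }
  // Toy hash for integer keys only.
  hash(key) {
    return Math.abs(key) % this.buckets.length;
  }
  add(key) {
    const bucket = this.buckets[this.hash(key)];
    if (!bucket.includes(key)) bucket.push(key);
  }
  has(key) {
    // Only the one small bucket the key hashes to is scanned,
    // which is why lookups are O(1) on average.
    return this.buckets[this.hash(key)].includes(key);
  }
}

const toy = new ToyHashSet();
[1, 2, 3].forEach(n => toy.add(n));
console.log(toy.has(2), toy.has(42)); // true false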
Approach 3: Use a Set instead of a Map. A Set is the best fit for this use case because you don't need the mapped value (the true placeholder) from approach 2.
function findCommonElem(arr1, arr2) {
  const set = new Set(arr1);
  return arr2.filter(item => set.has(item));
}
const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];
console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];
This uses somewhat less space in practice, but the asymptotic complexities are the same as approach 2:
Time complexity = O(M + N)
Space complexity = O(M + N)
Thanks to Nick Parsons for pointing this out.
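As for your bonus question: sorting both arrays and then walking them with two pointers in a while loop also works. Here's a sketch, assuming numeric arrays; findCommonElemSorted is a hypothetical name:

function findCommonElemSorted(arr1, arr2) {
  // Sorting dominates the cost: O(M log M + N log N).
  const a = [...arr1].sort((x, y) => x - y);
  const b = [...arr2].sort((x, y) => x - y);
  const result = [];
  let i = 0, j = 0;
  // The merge-style walk itself is linear: O(M + N).
  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) {
      result.push(a[i]);
      i++;
      j++;
    } else if (a[i] < b[j]) {
      i++;
    } else {
      j++;
    }
  }
  return result;
}

console.log(findCommonElemSorted([5, 1, 3], [3, 2, 5])); // [3, 5]

Overall that's O(M log M + N log N), so the hash-based approaches above are asymptotically better; sorting mainly pays off when the arrays are already sorted or when you need the output in order. (Duplicate handling also differs slightly from the filter versions, since both pointers advance on a match.)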
Let's say that arr1.length is n and arr2.length is m.
The filter method runs the callback you wrote once for each item in arr1. That callback checks whether the item is in arr2, and in the worst case, where the item isn't found, includes() scans the whole array, i.e. m steps.
Therefore arr1.filter(x => arr2.includes(x)) runs in O(n * m) in the worst case.
As for the space complexity: filter creates a new array, and in the worst case that array is as big as the original, so the space complexity is O(n).
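To see that n * m growth concretely, you can count comparisons, using some() as an instrumentable stand-in for includes(); with disjoint arrays (the worst case) the count is exactly n * m. A small sketch, where countComparisons is a hypothetical helper:

function countComparisons(arr1, arr2) {
  let comparisons = 0;
  arr1.filter(x => arr2.some(y => {
    comparisons++; // one comparison per element inspected
    return y === x;
  }));
  return comparisons;
}

// Disjoint arrays: no match is ever found, so every element of arr2
// is inspected for every element of arr1.
console.log(countComparisons([1, 2, 3], [4, 5, 6, 7])); // 12 = 3 * 4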