
Time complexity of a recursive function with two calls

Consider this code:

def count_7(lst):
    # Count how many elements of lst are equal to 7 (assumes lst is non-empty).
    if len(lst) == 1:
        if lst[0] == 7:
            return 1
        else:
            return 0

    # Recurse on the two halves and add the results.
    return count_7(lst[:len(lst)//2]) + count_7(lst[len(lst)//2:])

Note: assume the slicing operations are O(1).
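
For concreteness, here is what the function computes on a couple of arbitrary example inputs:

# The function just counts how many elements of the list equal 7
# (these example inputs are arbitrary).
print(count_7([7, 1, 7, 3]))   # -> 2
print(count_7([2, 5, 9]))      # -> 0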

So, my intuition tells me it's O(n*log n), but I'm struggling to prove it rigorously.
I'd be glad for any help!


2 Answers

Ok, mathematically (sort of ;) I get something like this:

T(n) = 2T(n/2) + c
T(1) = 1

Expanding the recurrence k times gives the general form:

T(n) = 2^k * T(n/2^k) + (2^k - 1) * c
T(1) = 1

n/2^k == 1 when k == log n, so:

T(n) = 2^(log n) * T(1) + (2^(log n) - 1) * c

Since T(1) = 1 and 2^(log n) = n (the logarithm is base 2):

T(n) = n * 1 + (n - 1) * c
T(n) = n + (n - 1) * c
T(n) <= n * (1 + c)
T(n) = O(n)
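
If you want to sanity-check the algebra, here is a small sketch (not part of the derivation above; the constant c = 1 is an arbitrary choice) that evaluates the recurrence directly and compares it against the n * (1 + c) bound:

def T(n, c=1):
    # The recurrence from above: T(n) = 2*T(n/2) + c, with T(1) = 1.
    if n == 1:
        return 1
    return 2 * T(n // 2, c) + c

for k in range(1, 11):
    n = 2 ** k
    # With c = 1 the exact value is 2*n - 1, which stays below n*(1 + c) = 2*n.
    print(n, T(n), n * 2)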

A clue that this is not O(n*log n) is that you don't have to do any real work to combine the two subproblems. Unlike mergesort, where you have to merge the two sub-arrays, this algorithm only adds the two recursive results, so the combining step can be expressed as a constant c.

UPDATE: The intuition behind it

This algorithm should be O(n) because you visit each element in the array only once. It may not seem trivial because recursion never is.

For example, you divide the problem into two subproblems of half the size; each of those is then divided in half again, and this keeps going until every subproblem has size 1. When you finish, you'll have n subproblems of size 1, which is n * O(1) = O(n).

The path from the original problem down to the n problems of size 1 is logarithmic, because at each step you divide the size in two. But at each step you do only a constant amount of work to combine the results, so this doesn't add anything to the overall complexity.
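
As a rough illustration of that intuition, here is a sketch (the instrumented helper below is something I'm adding, not part of the original code) that counts how many calls are made and how deep the recursion goes:

def count_7_instrumented(lst, depth=1):
    # Returns (count of 7s, total number of calls, maximum recursion depth).
    if len(lst) == 1:
        return (1 if lst[0] == 7 else 0), 1, depth
    mid = len(lst) // 2
    left, lcalls, ldepth = count_7_instrumented(lst[:mid], depth + 1)
    right, rcalls, rdepth = count_7_instrumented(lst[mid:], depth + 1)
    return left + right, lcalls + rcalls + 1, max(ldepth, rdepth)

n = 1024
count, calls, depth = count_7_instrumented([7] * n)
print(count)   # 1024 (every element is a 7 in this example)
print(calls)   # 2047 = 2*n - 1 total calls -> linear in n
print(depth)   # 11   = log2(n) + 1 levels  -> logarithmic depth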

Hope this helps



The easiest way is to assume n is a power of 2 for simplicity: n = 2^m.

The time complexity of your algorithm is (c is a constant):

t(n) = 2 t(n/2) + c

And unrolling the recursion you get:

t(n) = 2^2 t(n/2^2) + 2c + c

     ...

     = 2^log(n) t(n/2^log(n)) + c(2^(log(n) - 1) + ... + 2^2 + 2^1 + 2^0)

Which can be simplified by noticing that log(n) = m, and thus 2^log(n) = 2^m = n:

     = n + c(2^(log(n) - 1) + ... + 2^2 + 2^1 + 2^0)

Finally, the geometric sum above reduces to 2^log(n) - 1, which equals n - 1, so:

 t(n) = n + c(n - 1) = (1 + c) n - c

So your solution is O(n).
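
As a quick sanity check of that closed form (a sketch with an arbitrarily chosen c, not part of the derivation), you can verify that t(n) = (1 + c) n - c satisfies the recurrence for powers of two:

c = 3  # arbitrary constant for the check

def closed(n):
    # Closed form derived above: t(n) = (1 + c)*n - c, with t(1) = 1.
    return (1 + c) * n - c

for m in range(1, 12):
    n = 2 ** m
    # The closed form should satisfy the recurrence t(n) = 2*t(n/2) + c.
    assert closed(n) == 2 * closed(n // 2) + c
print("closed form matches the recurrence for n = 2 .. 2**11")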



