Below is code whose purpose is to compute the power of an integer. My friend told me that the time complexity of this algorithm is O(log n), but in fact the number of function calls is not equal to log n. For example, power(2, 9) calls power 5 times (including the initial call power(2, 9)), while power(2, 8) calls power 4 times (including the initial call power(2, 8)). Even though 8 and 9 need the same number of bits, the numbers of function calls differ.
Why does this happen? Is this really an O(log n) algorithm?
#include <stdio.h>

int power(int a, int n) {
    if (n == 0) {
        return 1;
    }
    if (n == 1) {
        return a;
    }
    if (n % 2 == 0) {
        return power(a * a, n / 2);
    } else {
        return a * power(a, n - 1);
    }
}

int main() {
    for (int i = 0; i < 15; i++)
        printf("pow(%d, %d) = %d\n", 2, i, power(2, i));
    return 0;
}
Your implementation is O(log n), but it can be made slightly more efficient.
Note that hereafter, log means log base 2.
You have at most log(n) calls of power(a*a, n/2), plus one call of power(a, n-1) for every bit set in n. The number of bits set in n is at most log(n) + 1. Thus, the number of calls to power is at most log(n) + log(n) + 1 = 2 log(n) + 1. For instance, when n = 15 the sequence of calls (showing only the exponent argument) is

power(15), power(14), power(7), power(6), power(3), power(2), power(1)

which is log(n) + log(n) + 1 = 3 + 3 + 1 = 7 calls.
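If you want to see these counts concretely, here is a minimal sketch of your function with a call counter bolted on (the global calls variable is only for illustration, it is not part of the algorithm):

#include <stdio.h>

static int calls = 0;   /* illustration only: counts invocations of power */

int power(int a, int n) {
    calls++;
    if (n == 0) {
        return 1;
    }
    if (n == 1) {
        return a;
    }
    if (n % 2 == 0) {
        return power(a * a, n / 2);
    } else {
        return a * power(a, n - 1);
    }
}

int main() {
    int exponents[] = {8, 9, 15};
    for (int i = 0; i < 3; i++) {
        calls = 0;
        int result = power(2, exponents[i]);
        printf("power(2, %d) = %d, calls = %d\n", exponents[i], result, calls);
    }
    return 0;
}

This prints 4 calls for n = 8, 5 calls for n = 9 and 7 calls for n = 15, matching the counts in the question and the trace above.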
Here is a more efficient implementation that makes only log(n) + 2 calls of power.
int power(int a, int n) {
    if (n == 0) {
        return 1;
    }
    if ((n & 1) == 0) {             /* n is even */
        return power(a * a, n / 2);
    } else {                        /* n is odd */
        return a * power(a * a, n / 2);
    }
}
In this case the sequence of calls when n = 15 is
power(15), power(7), power(3), power(1), power(0)
I removed the if (n == 1) condition: that test would have been performed log(n) times, and we can avoid it at the cost of a single extra call to power (the final call with n == 0).
We then have log(n) + 2 calls to power, which is better than 2 log(n) + 1.
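As a quick sanity check (again with a counter that exists only for illustration), the improved version can be instrumented the same way:

#include <stdio.h>

static int calls = 0;   /* illustration only: counts invocations of power */

int power(int a, int n) {
    calls++;
    if (n == 0) {
        return 1;
    }
    if ((n & 1) == 0) {
        return power(a * a, n / 2);
    } else {
        return a * power(a * a, n / 2);
    }
}

int main() {
    printf("power(2, 15) = %d, calls = %d\n", power(2, 15), calls);
    return 0;
}

For n = 15 this prints 5 calls, which is log(n) + 2 = 3 + 2.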