The answer for problem 8.3-4:
- 8.3-4 (from CLRS) Show how to sort n integers in the range 0 to n^2 − 1 in O(n) time.
The answer:
First take O(n) time to convert the integers into two-digit numbers in base n ....
This assumes that we can convert n integers in the range 0 to n^2 − 1 to base n in O(n) time. How is this possible? Shouldn't each conversion take O(log(n)) time, and hence n conversions take O(n log n) rather than O(n)?
I assume the author meant that each integer arithmetic operation is done in O(1) time, so converting a number x to base n is just two operations: the most significant digit is x/n and the least significant digit is x%n, which under that assumption takes constant time.
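For concreteness, here is a minimal sketch of that two-operation conversion in Python; the helper name base_n_digits is my own, not from CLRS:

```python
def base_n_digits(x, n):
    # Split x (with 0 <= x < n*n) into its two base-n digits.
    # Two word-RAM arithmetic operations, hence O(1) under the assumption above.
    return x // n, x % n  # (most significant, least significant)

print(base_n_digits(23, 5))  # 23 = 4*5 + 3  ->  (4, 3)
```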
Without the given assumption, each number needs log(n^2) = 2 log(n) bits to be represented, so merely reading the input already takes O(n log n) time; the assumption is needed to meet the time requirement.
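Putting it together, here is a minimal sketch of the whole O(n) sort the quoted answer describes: radix sort on the two base-n digits, using a stable counting sort for each pass. The function names are mine, and the word-RAM assumption (constant-time arithmetic on O(log n)-bit words) is baked in:

```python
def counting_sort_by_digit(a, n, digit):
    """Stable counting sort of a by digit(x) in range [0, n). O(n) time and space."""
    count = [0] * n
    for x in a:
        count[digit(x)] += 1
    for d in range(1, n):            # prefix sums: count[d] = one past the last
        count[d] += count[d - 1]     # output slot for elements whose digit is d
    out = [0] * len(a)
    for x in reversed(a):            # reversed traversal keeps the sort stable
        count[digit(x)] -= 1
        out[count[digit(x)]] = x
    return out

def sort_range_n_squared(a):
    """Sort n integers drawn from [0, n^2) in O(n) total time."""
    n = len(a)
    a = counting_sort_by_digit(a, n, lambda x: x % n)      # pass 1: least significant digit
    return counting_sort_by_digit(a, n, lambda x: x // n)  # pass 2: most significant digit

print(sort_range_n_squared([7, 0, 3, 8, 2]))  # n = 5, range [0, 25) -> [0, 2, 3, 7, 8]
```

Each pass performs O(n) constant-time operations, so the total is O(n), matching the book's claim.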