Reducing problem A to problem B means that problem B is at least as hard as problem A, and possibly harder.
If I can reduce sorting to some other problem X, I know that X has a lower bound of Omega(n log n) (in the comparison model). Is that lower bound guaranteed to be tight? I suspect it shouldn't be, because X is only known to be at least as hard as sorting -- implying that it could be harder, and have a larger lower bound as a result.
By "tight" I mean it in the sense that, because insertion sort has a tight worst-case upper bound of O(n^2), it is also correct to say it has a worst-case running time of O(n^3). Correct, but of little practical value, because we're interested in tight bounds 99% of the time.
You are absolutely right: the bound does not need to be tight.
Consider a simple example: finding the smallest integer in an array of n integers (call it MININT). There is an algorithm that solves this problem in O(n) time and O(1) space. However, MININT reduces to the problem of sorting an array of integers (SORTINT) via transformations that each cost O(1):
- MININT input to SORTINT input: use MININT's input directly as SORTINT's input.
- SORTINT output to MININT output: return the first element of the sorted array (assuming elements are sorted in ascending order).

Since MININT requires examining all n elements, the reduction tells us that sorting certainly has a lower bound of Omega(n) on worst-case inputs. This is not a tight bound; Omega(n log n) is a tighter lower bound for SORTINT. But the reduction of MININT to SORTINT, by itself, does not tell us that.
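To make the two directions of the reduction concrete, here is a minimal Python sketch: solving MININT directly, and solving it through SORTINT. The function names are illustrative, not part of the original problem statement.

```python
def min_int_direct(xs: list[int]) -> int:
    """MININT solved directly: one linear scan, O(n) time, O(1) extra space."""
    smallest = xs[0]
    for x in xs[1:]:
        if x < smallest:
            smallest = x
    return smallest


def min_int_via_sort(xs: list[int]) -> int:
    """MININT reduced to SORTINT: the input is passed through unchanged
    (the O(1) input transformation), and the O(1) output transformation
    reads the first element of the ascending sorted array."""
    return sorted(xs)[0]


if __name__ == "__main__":
    data = [7, 3, 9, 1, 4]
    assert min_int_direct(data) == min_int_via_sort(data) == 1
```

Both routes return the same answer; the difference is that the reduction route inherits sorting's cost, which is exactly why the Omega(n) bound the reduction yields for SORTINT is correct but not tight.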