NSUInteger index = [self.objects indexOfObject:obj];
if (index == NSNotFound) {
    // Succeeds! Note: NSNotFound is internally defined as NSIntegerMax
}
if (index == NSUIntegerMax) {
    // Fails!
}
Why? indexOfObject: is supposed to return an unsigned value (NSUInteger), so naturally I assumed that if the object is not found, it would return NSUIntegerMax rather than NSIntegerMax. Is this a bug, or is there a logical explanation for this behavior?
Perhaps NSNotFound can also be used in contexts that use NSInteger. It is also safer in case someone declares index as NSInteger instead of NSUInteger: NSIntegerMax fits in a signed NSInteger, whereas NSUIntegerMax would not survive the conversion.
At most, one could say that it's odd that NSNotFound is defined as NSIntegerMax, but it's certainly not a bug.
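For what it's worth, here is a minimal sketch (assuming a 64-bit build, where NSInteger and NSUInteger are both 64 bits wide) illustrating why the NSNotFound comparison succeeds while the NSUIntegerMax comparison fails:
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSArray *objects = @[@"a", @"b"];
        // "c" is not in the array, so indexOfObject: returns NSNotFound.
        NSUInteger index = [objects indexOfObject:@"c"];

        // NSNotFound is defined as NSIntegerMax, so on a 64-bit build:
        //   NSNotFound    == 0x7FFFFFFFFFFFFFFF
        //   NSUIntegerMax == 0xFFFFFFFFFFFFFFFF
        NSLog(@"index         = %lx", (unsigned long)index);
        NSLog(@"NSNotFound    = %lx", (unsigned long)NSNotFound);
        NSLog(@"NSUIntegerMax = %lx", (unsigned long)NSUIntegerMax);

        NSLog(@"index == NSNotFound    -> %d", index == NSNotFound);    // 1
        NSLog(@"index == NSUIntegerMax -> %d", index == NSUIntegerMax); // 0

        // Because NSNotFound fits in a signed NSInteger, this also works:
        NSInteger signedIndex = [objects indexOfObject:@"c"];
        NSLog(@"signedIndex == NSNotFound -> %d", signedIndex == NSNotFound); // 1
    }
    return 0;
}
The last comparison is the point of defining NSNotFound as NSIntegerMax: the sentinel still compares correctly even if the index is stored in a signed NSInteger.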