Imagine you have a bunch of NSArrays, all containing CGPoints wrapped in NSValues. The elements are not all unique, so a given point can appear in multiple arrays. What's the fastest way to combine these arrays into a single array containing only the unique elements?
Currently I'm doing it by folding each array into an immutable set with setByAddingObjectsFromArray:. Another option is to insert everything into a hash-based collection such as an NSMutableSet.
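For reference, the two approaches might look roughly like this (a sketch, not necessarily the asker's exact code; `arrays` is assumed to be an NSArray containing the point arrays):

```objc
#import <Foundation/Foundation.h>

// Option 1: fold each array into a new immutable set.
// setByAddingObjectsFromArray: returns a fresh NSSet on every call,
// so intermediate sets are created and discarded along the way.
NSSet *combined = [NSSet set];
for (NSArray *points in arrays) {
    combined = [combined setByAddingObjectsFromArray:points];
}
NSArray *uniqueViaImmutable = [combined allObjects];

// Option 2: accumulate into a single mutable set.
NSMutableSet *mutableSet = [NSMutableSet set];
for (NSArray *points in arrays) {
    [mutableSet addObjectsFromArray:points];
}
NSArray *uniqueViaMutable = [mutableSet allObjects];
```

Either way, deduplication works because NSValue implements isEqual: and hash based on the wrapped value, so two NSValues wrapping the same CGPoint compare equal.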
Traditional runtime analysis says the first option should scale as O(n log n), where n is the total number of elements across all the initial arrays (traverse every element and insert each into a binary search tree or similar in logarithmic time). For the second approach the runtime is O(n), since lookup and insertion into a hash table run in amortized constant time.
However, after reading a bit about Apple's data structures, it seems foolish to assume they behave like the traditional textbook structures.
Since both of these approaches will land somewhere between O(n) and O(n log n), and your n is likely small enough that log n is bounded and small, constant factors will probably decide which one is faster.
At this point, actually benchmarking with data that looks like your use-case is the best thing to do.
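A minimal timing harness might look like this. It is only a sketch: MakeRepresentativePointArrays is a hypothetical helper you would write to generate data shaped like your real input, and in practice you should run each variant many times and look at the distribution rather than a single measurement.

```objc
#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h> // for CACurrentMediaTime()

// Hypothetical helper: builds test data that mirrors your real arrays
// (same array count, sizes, and duplicate ratio).
NSArray *arrays = MakeRepresentativePointArrays();

CFTimeInterval start = CACurrentMediaTime();
NSMutableSet *set = [NSMutableSet set];
for (NSArray *points in arrays) {
    [set addObjectsFromArray:points];
}
NSArray *unique = [set allObjects];
CFTimeInterval elapsed = CACurrentMediaTime() - start;

NSLog(@"%lu unique points in %.3f ms",
      (unsigned long)unique.count, elapsed * 1000.0);
```

Repeat the same measurement with the setByAddingObjectsFromArray: version and compare.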
I believe, but am not certain, that NSSet is implemented with a hash table as well.