
Memory Allocation in std::map

I am doing a report on the various C++ dictionary implementations (map, dictionary, vectors etc).

The results for insertions using a std::map show that the performance is O(log n). There are also consistent spikes in the timings. I am not 100% sure what is causing them; I think they are caused by memory allocation, but I have been unable to find any literature or documentation to confirm this.

Can anyone clear this matter up or point me in the right direction?

Cheers.


2 Answers

You are right: insertion is O(log n). This is due to the sorted nature of map (normally implemented as a balanced binary tree).

Also see http://www.sgi.com/tech/stl/UniqueSortedAssociativeContainer.html, which has a note on insert. Its worst case is O(log n), and amortized O(1) if you can hint where the insert should happen.

Maps are normally based on binary trees and need to be rebalanced to keep good performance. The spikes you are observing probably correspond to this rebalancing.

kmkaplan, answered Oct 27 '25 at 20:10


The empirical approach isn't strictly necessary when it comes to the STL. There's no point in experimenting when the standard clearly dictates the complexity requirements of operations such as std::map insertion.

I urge you to read the standard so you're aware of the complexity guarantees before continuing with experiments. Of course, there might be bugs in whatever STL implementation you happen to be testing, but the popular STLs are pretty well-debugged creatures and very widely used, so I doubt it.

Assaf Lavie, answered Oct 27 '25 at 19:10