I work in Computational Fluid Dynamics (CFD), but I don't know MPI very well.
Heavy CFD jobs require InfiniBand support, and people say that MVAPICH is usually much better than other MPI implementations. Is this true? Does anyone have real experience or references I can look at? And why would MVAPICH be better than OpenMPI and the others? Is it written by an InfiniBand company, or what?
Thanks a lot!
So the answer is "probably not, and it doesn't matter anyway".
You write your code using the MPI API, and you can always install multiple MPI libraries and test against each, as you might with several LAPACK implementations. If one's consistently faster for your application, use it. But the MPI community is very concerned with performance, the free competitors are all open source and publish their methods in papers, and all of them publish lots of benchmarks. The friendly rivalry, combined with the openness, tends to mean that no implementation has a significant performance advantage for long.
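To make that concrete, here's a minimal sketch of what "write to the MPI API, swap the library underneath" looks like. Nothing below is specific to any one implementation; the same source builds with MPICH2, OpenMPI, MVAPICH2, or IntelMPI, and only the compiler wrapper and launcher you invoke change:

```c
/* Minimal MPI program: identical source for every implementation.
 * Build with whichever mpicc wrapper you like (MPICH2, OpenMPI,
 * MVAPICH2, IntelMPI) and launch with its mpirun/mpiexec. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, len;
    char version[MPI_MAX_LIBRARY_VERSION_STRING];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        /* MPI_Get_library_version (MPI-3) reports which library you actually linked */
        MPI_Get_library_version(version, &len);
        printf("Running on %d ranks with: %s\n", size, version);
    }

    MPI_Finalize();
    return 0;
}
```

Swapping implementations is a rebuild and a relaunch, not a rewrite, which is why "just benchmark your own application against each" is practical advice.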
On our big x86 cluster we've done "real world" and micro benchmarks with MPICH2, OpenMPI, MVAPICH2, and IntelMPI. Amongst the three open-source versions, there was no clear winner; in some cases one would win by 10-20%, in others it would lose by the same amount. On those few occasions where we were interested enough to dig into the details to find out why, it was often just a matter of defaults for things like eager limits or the crossover between different collective algorithms, and by setting a couple of environment variables we got the performance difference down to within noise. In other cases, a performance advantage persisted but was either not large enough or not consistent enough for us to investigate further.
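If you want to see the eager-limit effect yourself, a ping-pong microbenchmark like the sketch below is usually enough (the message sizes and repetition count are arbitrary choices, not anything we standardized on): kinks in the round-trip-time curve often line up with an implementation's default eager/rendezvous threshold, and each implementation lets you move that threshold through its own runtime parameter, whose name you'd have to look up in its documentation.

```c
/* Sketch of a ping-pong microbenchmark between ranks 0 and 1.
 * Kinks in the round-trip-time curve often coincide with the library's
 * default eager/rendezvous crossover. Run with at least 2 ranks. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size;
    const int reps = 1000;               /* arbitrary repetition count */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    if (size < 2) {
        if (rank == 0) fprintf(stderr, "Needs at least 2 ranks\n");
        MPI_Abort(MPI_COMM_WORLD, 1);
    }

    for (int bytes = 1; bytes <= (1 << 22); bytes <<= 1) {   /* 1 B .. 4 MiB */
        char *buf = malloc(bytes);
        MPI_Barrier(MPI_COMM_WORLD);
        double t0 = MPI_Wtime();

        for (int i = 0; i < reps; i++) {
            if (rank == 0) {
                MPI_Send(buf, bytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, bytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(buf, bytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
                MPI_Send(buf, bytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }

        if (rank == 0)
            printf("%8d bytes: %10.2f us round trip\n",
                   bytes, 1e6 * (MPI_Wtime() - t0) / reps);
        free(buf);
    }

    MPI_Finalize();
    return 0;
}
```

Run the same binary under each implementation's launcher and compare the curves; differences at particular message sizes are usually down to defaults like these rather than anything fundamental.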
(IntelMPI, which costs significant amounts of money, was noticeably and mostly-consistently faster, although what we considered the big win there was substantially improved startup times for very large jobs.)
MVAPICH was one of the first MPI implementations to really go after InfiniBand performance, after the group behind it had gained lots of experience with Myrinet, and it did have a significant advantage there for quite some time; there are probably benchmarks in which it still wins. But ultimately there was no consistent and important performance win, and we went with OpenMPI for our main open-source MPI option.