
Identifying source of and understanding OpenBLAS and OpenMP warnings

I am developing a deep learning model using PyTorch, pytorch-lightning, and segmentation-models-pytorch. When I run pytorch_lightning.Trainer.fit(), I get hundreds of copies of the following warning:

OpenBLAS Warning : Detect OpenMP Loop and this application may hang. Please rebuild the library with USE_OPENMP=1 option.

I have the following questions:

  1. How can I identify what part of my code or the source code is raising this warning?
  2. How can I assess whether this warning is relevant or can be ignored?
  3. If I decide I can ignore the warning, how can I suppress the warning?

I am familiar with handling warnings through Python's warnings module. However, that doesn't help here because the warning comes from OpenBLAS, which is not a Python library.
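(Regarding question 3: because the message is printed by C code directly to the process's standard error stream, Python's `warnings` machinery never sees it; the only way to silence it from Python is at the file-descriptor level. A hedged sketch, where `suppress_stderr_fd` is a helper name invented for illustration:)

```python
import contextlib
import os
import sys

@contextlib.contextmanager
def suppress_stderr_fd():
    """Temporarily redirect the OS-level stderr (fd 2) to /dev/null.

    Python's `warnings` module cannot intercept messages that a C library
    such as OpenBLAS writes straight to file descriptor 2, but rerouting
    the descriptor itself silences them. Note this hides *all* stderr
    output, including legitimate errors, while the context is active.
    """
    sys.stderr.flush()
    saved = os.dup(2)                          # keep a copy of the real stderr
    devnull = os.open(os.devnull, os.O_WRONLY)
    try:
        os.dup2(devnull, 2)                    # fd 2 now points at /dev/null
        yield
    finally:
        os.dup2(saved, 2)                      # restore the original stderr
        os.close(devnull)
        os.close(saved)

# Usage sketch: wrap the call that triggers the warning
with suppress_stderr_fd():
    print("hidden from stderr", file=sys.stderr)
print("visible again", file=sys.stderr)
```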

There are several other questions about how to fix the problem that causes this warning, e.g. here and here. My question is about understanding the source of the warning, deciding whether I care about it, and suppressing it if I don't care.

Thanks in advance for any tips or answers to the above questions. Apologies if these are silly or poorly formulated questions, as I am completely unfamiliar with OpenBLAS and OpenMP.

asked Oct 27 '25 by sdg
1 Answer

OpenBLAS is a low-level library that provides fast implementations of most common linear algebra operations, and OpenMP provides primitives for parallel computing on shared-memory machines.

PyTorch, like many other ML and scientific-computing libraries, uses one or both of them. For instance, NumPy builds on OpenBLAS, which guarantees very fast matrix operations, while scikit-learn uses OpenMP to execute parallel jobs.
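(Regarding question 1: you can check which BLAS/OpenMP runtimes your process has actually loaded, and therefore which library is printing the warning. Below is a Linux-only sketch that scans /proc/self/maps; the name-fragment list is a heuristic chosen for illustration, not an official API:)

```python
import numpy as np  # importing NumPy pulls its BLAS backend into the process

# Heuristic name fragments for well-known BLAS/threading runtimes (an assumption)
THREADING_HINTS = ("openblas", "gomp", "iomp", "libomp", "mkl", "tbb")

def loaded_threading_libs():
    """List BLAS/OpenMP-related shared libraries mapped into this process.

    Linux-only: parses /proc/self/maps and keeps file names that contain
    one of the fragments above. If both an OpenBLAS and an OpenMP runtime
    appear, the warning's precondition (both in one process) is met.
    """
    hits = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            name = line.rstrip("\n").rsplit("/", 1)[-1].lower()
            if any(hint in name for hint in THREADING_HINTS):
                hits.add(name)
    return sorted(hits)

print(loaded_threading_libs())
```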

One complication is that OpenBLAS itself can be built with OpenMP. Numerical-computing libraries are built against a specific OpenBLAS configuration (for example, a particular OpenMP build), which can conflict with other libraries in the same process that also depend on OpenMP.

Here is more information about how these dependencies are managed across different package distros: https://pypackaging-native.github.io/key-issues/native-dependencies/blas_openmp/

According to the same source, PyTorch makes use of OpenMP but does not explicitly specify which OpenMP runtime to use or how to use it. This can conflict with other multi-threading libraries loaded in the same process and cause issues like the one you are seeing.

The warning raised by OpenBLAS in your case means that an OpenBLAS build compiled without OpenMP support has detected that it is being called from inside an OpenMP parallel region, which it does not expect and which can lead to a hang (see https://github.com/open-ce/pytorch-feedstock/issues/34, https://github.com/pytorch/pytorch/issues/52047, and https://github.com/OpenMathLib/OpenBLAS/issues/2197).
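(A commonly reported mitigation, which also addresses the "suppress" part of the question, is to restrict OpenBLAS to a single thread so it never tries to spawn its own thread pool inside an OpenMP region. A minimal sketch; the environment variable must be set before NumPy/PyTorch are first imported, or it is read too late:)

```python
import os

# Must happen before any library that loads OpenBLAS is imported;
# OpenBLAS reads this variable once, at load time.
os.environ["OPENBLAS_NUM_THREADS"] = "1"

import numpy as np

# BLAS-backed matrix multiply still works, just single-threaded in OpenBLAS
a = np.ones((256, 256))
print((a @ a)[0, 0])  # each entry is a sum of 256 ones
```

The OpenMP threads used by PyTorch itself are unaffected, so intra-op parallelism in your training loop is preserved.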

One solution proposed on the PyTorch side (since PyTorch is the root cause in this case) is to tell PyTorch to use another available parallel-computing backend, TBB (see https://pytorch.org/docs/stable/notes/cpu_threading_torchscript_inference.html). This is done by setting USE_TBB=1 when building PyTorch from source.
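(As a rough sketch, a source build selecting the TBB backend would look like the following; USE_TBB and ATEN_THREADING are the build options mentioned in the linked CPU-threading notes, but verify them against the documentation for your PyTorch version:)

```shell
# Hedged sketch: build PyTorch from source with the TBB parallel backend.
# Flag names taken from the PyTorch CPU threading notes; check your version.
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch
USE_TBB=1 ATEN_THREADING=TBB python setup.py install
```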

answered Oct 28 '25 by inarighas

