 

How to specialize a template function for different data types in which the procedures are similar?

Tags:

c++

templates

For example, I want to implement a matrix-multiplication template function using AVX2. (Suppose "Matrix" is a well-implemented template class.)

#include <typeinfo>

template <typename T>
Matrix<T> matmul(const Matrix<T>& mat1, const Matrix<T>& mat2) {
    if (typeid(T) == typeid(float)) {
        // use __m256 to store float
        // use _mm256_load_ps, _mm256_mul_ps, _mm256_add_ps
    } else if (typeid(T) == typeid(double)) {
        // use __m256d to store double
        // use _mm256_load_pd, _mm256_mul_pd, _mm256_add_pd
    } else {
        //...
    }
}

As there is no "variable" that holds a data type, the program cannot branch on T at runtime to decide whether it should use __m256, __m256d, or anything else, which makes the code very long and awkward. Is there a way to avoid this?

asked Dec 01 '25 by Haoson Q

1 Answer

In C++17 and later, you can use if constexpr, which selects a branch at compile time and discards the others entirely, so the intrinsics for the wrong type are never even instantiated:

#include <type_traits>

template <typename T>
Matrix<T> matmul(const Matrix<T>& mat1, const Matrix<T>& mat2) {
    if constexpr (std::is_same_v<T, float>) {
        // use __m256 to store float
        // use _mm256_load_ps, _mm256_mul_ps, _mm256_add_ps
    } else if constexpr (std::is_same_v<T, double>) {
        // use __m256d to store double
        // use _mm256_load_pd, _mm256_mul_pd, _mm256_add_pd
    } else {
        //...
    }
}
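To make the if constexpr branches concrete, here is a minimal sketch of a vectorized elementwise multiply-add over raw arrays (simpler than a full matmul, but the same dispatch pattern). It assumes AVX2 is enabled at compile time (e.g. -mavx2) and that n is a multiple of the vector width; mul_add is an illustrative name, not part of any library:

#include <immintrin.h>
#include <type_traits>
#include <cstddef>

template <typename T>
void mul_add(const T* a, const T* b, T* out, std::size_t n) {
    if constexpr (std::is_same_v<T, float>) {
        for (std::size_t i = 0; i < n; i += 8) {   // __m256 holds 8 floats
            __m256 va = _mm256_loadu_ps(a + i);
            __m256 vb = _mm256_loadu_ps(b + i);
            __m256 vo = _mm256_loadu_ps(out + i);
            _mm256_storeu_ps(out + i, _mm256_add_ps(vo, _mm256_mul_ps(va, vb)));
        }
    } else if constexpr (std::is_same_v<T, double>) {
        for (std::size_t i = 0; i < n; i += 4) {   // __m256d holds 4 doubles
            __m256d va = _mm256_loadu_pd(a + i);
            __m256d vb = _mm256_loadu_pd(b + i);
            __m256d vo = _mm256_loadu_pd(out + i);
            _mm256_storeu_pd(out + i, _mm256_add_pd(vo, _mm256_mul_pd(va, vb)));
        }
    } else {
        for (std::size_t i = 0; i < n; ++i)        // scalar fallback
            out[i] += a[i] * b[i];
    }
}

Note that the branches not taken are discarded before instantiation: for T = double, the _mm256_loadu_ps calls are never compiled at all, which is exactly what a plain runtime if cannot give you.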

Otherwise (before C++17), just use overloads:

Matrix<float> matmul(const Matrix<float>& mat1, const Matrix<float>& mat2) {
    // use __m256 to store float
    // use _mm256_load_ps, _mm256_mul_ps, _mm256_add_ps
}

Matrix<double> matmul(const Matrix<double>& mat1, const Matrix<double>& mat2) {
    // use __m256d to store double
    // use _mm256_load_pd, _mm256_mul_pd, _mm256_add_pd
}

...
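A related trick keeps a single generic matmul body: instead of duplicating the whole function, overload only thin wrappers around the intrinsics. This is a sketch, and vload/vmul/vadd/vstore are illustrative names of my own, not a real API:

#include <immintrin.h>

inline __m256  vload(const float* p)      { return _mm256_loadu_ps(p); }
inline __m256d vload(const double* p)     { return _mm256_loadu_pd(p); }
inline __m256  vmul(__m256 a, __m256 b)   { return _mm256_mul_ps(a, b); }
inline __m256d vmul(__m256d a, __m256d b) { return _mm256_mul_pd(a, b); }
inline __m256  vadd(__m256 a, __m256 b)   { return _mm256_add_ps(a, b); }
inline __m256d vadd(__m256d a, __m256d b) { return _mm256_add_pd(a, b); }
inline void vstore(float* p, __m256 v)    { _mm256_storeu_ps(p, v); }
inline void vstore(double* p, __m256d v)  { _mm256_storeu_pd(p, v); }

// A generic kernel then works for both element types; overload
// resolution picks the right intrinsic. This needs no C++17 at all.
template <typename T>
void vec_mul(const T* a, const T* b, T* out) {  // one SIMD vector's worth
    vstore(out, vmul(vload(a), vload(b)));
}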
answered Dec 03 '25 by Remy Lebeau


