Unfortunately this code doesn't work:
#include <iostream>
#include <type_traits>
#include <utility>
#include <tuple>

template<typename FirstArg, typename ... Args>
requires (sizeof ... (Args) == 0 || (std::is_convertible_v<Args ..., FirstArg>))
constexpr void multi_max( FirstArg firstArg, Args const &... args )
{
    using namespace std;
    FirstArg &max = firstArg;
    auto findMax = [&]<size_t ... Is>( index_sequence<Is ...> iseq, tuple<Args ...> tArgs )
    {
        ((max = get<Is>( tArgs ) > max ? get<Is>( tArgs ) : max), ...);
    };
    findMax( make_index_sequence<sizeof ... (Args)>(), make_tuple( args ... ) );
}

int main()
{
    multi_max( 1 );
    //multi_max( 1, 2, 3 );
}
The commented-out call doesn't fulfill the right half of the requires-constraint. Why? And if I remove the first constraint, the compiler complains about wrong unpacking of the args into a tuple.
Looking at the relevant portion of the compiler error:
note: because substituted constraint expression is ill-formed: too many template arguments for variable template 'is_convertible_v'
You're using is_convertible_v<int, int, int>. What you want is for each individual argument's test to be true, which you can do with a fold expression:
requires (sizeof ... (Args) == 0 || (std::is_convertible_v<Args, FirstArg> && ...))
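To make the difference concrete (a sketch of my own, not compiler output), the corrected fold for the commented-out call multi_max( 1, 2, 3 ) expands into one trait check per pack element, which can be verified directly:

#include <type_traits>

// What the corrected fold checks for multi_max( 1, 2, 3 ), i.e. Args... = int, int:
static_assert( std::is_convertible_v<int, int> && std::is_convertible_v<int, int> );
// The broken version instead asked for std::is_convertible_v<int, int, int>,
// which is ill-formed because is_convertible_v takes exactly two type arguments.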
However, for a pack of 0 elements, the fold expression over && will produce true, so the first part is now redundant:
requires (std::is_convertible_v<Args, FirstArg> && ...)
Cleaning up to use the concept convertible_to instead of the old type trait:
requires (std::convertible_to<Args, FirstArg> && ...)
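One practical detail worth noting (my addition): std::convertible_to lives in <concepts>, so the include list needs adjusting once the type trait goes away:

#include <concepts>   // for std::convertible_to; <type_traits> is no longer needed for the constraint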
Now things are looking a little suspiciously convenient. We can actually move this to the template parameter and get rid of the requires entirely:
template<typename FirstArg, std::convertible_to<FirstArg> ... Args>
This placeholder will shove FirstArg in as the second argument and place the given type as the first, so a given T would test std::convertible_to<T, FirstArg>.
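Putting the pieces together, a complete version might look like the sketch below. This is my own assembly, not verbatim from the question: it also returns the result instead of discarding it, and uses std::tie instead of make_tuple so the lambda's tuple parameter exactly matches what it is given.

#include <iostream>
#include <concepts>
#include <cstddef>
#include <utility>
#include <tuple>

template<typename FirstArg, std::convertible_to<FirstArg> ... Args>
constexpr FirstArg multi_max( FirstArg firstArg, Args const &... args )
{
    using namespace std;
    FirstArg max = firstArg;
    // Unpack the argument tuple via an index sequence and keep the largest value.
    auto findMax = [&]<size_t ... Is>( index_sequence<Is ...>, tuple<Args const &...> tArgs )
    {
        ((max = get<Is>( tArgs ) > max ? get<Is>( tArgs ) : max), ...);
    };
    findMax( make_index_sequence<sizeof ... (Args)>(), tie( args ... ) );
    return max;
}

int main()
{
    std::cout << multi_max( 1 ) << '\n';        // prints 1 (empty pack is fine)
    std::cout << multi_max( 1, 2, 3 ) << '\n';  // prints 3 (the previously failing call)
}

With the constrained template parameter in place, the previously commented-out call multi_max( 1, 2, 3 ) satisfies the constraint and compiles.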