This doesn't work as expected:
#include <iostream>

template<typename T>
struct PHI {
    // 2^64 / golden ratio, shifted down to the width of T
    enum : T { value = 11400714819323198485ULL >> (64 - sizeof(T) * 8) };
};

int main() { std::cout << PHI<unsigned long long>::value; }
The output is 2135587861, but I expected 11400714819323198485 (in VS2013).
I thought PHI<unsigned long long>::value would implicitly convert to unsigned long long when needed, but it actually converts to unsigned. That means when I use it somewhere else it might convert to unsigned as well, which is not what I want.
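A quick check (just a sketch, assuming C++11's std::underlying_type is usable here) should confirm that the enum really does get the fixed underlying type I asked for, so the narrowing must be happening in a later conversion:

#include <type_traits>
// On a conforming compiler this assertion should hold: the unnamed enum inside
// PHI<unsigned long long> was declared with enum-base T = unsigned long long.
static_assert(
    std::is_same<std::underlying_type<decltype(PHI<unsigned long long>::value)>::type,
                 unsigned long long>::value,
    "PHI's enum should have unsigned long long as its underlying type");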
Let's get rid of the templates. Minimized repro:
#include <iostream>
enum bar : unsigned long long { baz = 11400714819323198485ULL };
void foo(int v) { std::cout << "int " << v; }
void foo(unsigned v) { std::cout << "uint " << v; }
void foo(unsigned long long v) { std::cout << "ull " << v; }
int main() { foo(baz); }
This prints uint 2135587861.
Meanwhile,
#include <iostream>
enum bar : unsigned long long { baz = 11400714819323198485ULL };
void foo(unsigned long long v) { std::cout << "ull " << v; }
int main() { foo(baz); }
prints ull 11400714819323198485. So the value is preserved, and the conversion can be done. This looks like a bug in VC++'s overload resolution. This also reproduces in VS2015 CTP5.
Edit: reported as https://connect.microsoft.com/VisualStudio/feedback/details/1131433
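Until that is fixed, an explicit cast sidesteps promotion and overload resolution entirely; applied to the repro above (a sketch, not part of the original report) it should select the unsigned long long overload:

int main() { foo(static_cast<unsigned long long>(baz)); } // should print "ull 11400714819323198485"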
The shortest workaround appears to be using foo(+baz); the unary + forces an integral promotion before overload resolution is performed.
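Applied back to the question's template (again just a sketch of the same idea, assuming the PHI definition above), the unary + keeps the full 64-bit value:

#include <iostream>
int main() {
    // + promotes PHI<unsigned long long>::value to its underlying type,
    // unsigned long long, before the stream insertion overload is chosen.
    std::cout << +PHI<unsigned long long>::value; // should print 11400714819323198485
}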