Why are missing type parameters inferred as unknown in TypeScript?

Why does a generic type parameter in a call get inferred as the unknown type (or as its constraint type) when it is omitted? Consider:

function doStuff<T>(): T {
  return {} as any as T;
}

const result = doStuff();

I would expect the call to doStuff to be an error, since the type parameter is missing. Instead, T is inferred as unknown, so the type of result is unknown. Why? If T has a constraint, then the type of result is the constraint type instead.
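
For example, an illustrative variant of the function above with a constraint added (doStuffConstrained and result2 are names I've made up for the example):

function doStuffConstrained<T extends { id: number }>(): T {
  return {} as any as T;
}

// No type argument and no inference sites: T falls back to its
// constraint, so result2 has type { id: number } rather than erroring.
const result2 = doStuffConstrained();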

I can understand defaults being useful, but TypeScript already has an explicit defaulting mechanism for generic type parameters. Is this a historical hang-up, or what is the thinking?
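
For reference, a minimal sketch of that defaulting mechanism (doStuffWithDefault is an illustrative name):

function doStuffWithDefault<T = never>(): T {
  return {} as any as T;
}

// With no type argument and no inference candidates, the declared
// default takes precedence, so resultDefault has type never.
const resultDefault = doStuffWithDefault();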

I'm using TypeScript 3.9.

This is similar to this question, but I am asking why (not assuming the behavior is incorrect), and this example is simpler.

asked by Eamonn Boyle


1 Answer

It's documented in the (increasingly outdated) TypeScript Spec here that "if the set of candidate argument types is empty, the inferred type argument for T is T's constraint." So it's intended, but you want to know why. I assume you don't want my opinion about why it might be useful behavior not to issue an error instead, so the best I can do is see if there's any documented discussion by the TS team about this.

Minor aside: In TypeScript 3.5, the implicit constraint for generic type parameters was changed from {} to unknown. That's just some context for the only issues I can find about this:

There's microsoft/TypeScript#360, asking for an error to be issued if type inference produces {}. The issue description talks mostly about the situation where there are multiple non-overlapping inference candidates and the compiler widens all the way to a top-like type in order to find a "best common supertype". That is not your problem, and it has since been addressed in TypeScript by inferring unions and by preventing union inference when the types are unrelated.
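
A quick sketch of how modern TypeScript handles that multi-candidate case (choose is an illustrative declaration):

declare function choose<T>(a: T, b: T): T;

// Unrelated candidates no longer widen T to a top-like type;
// instead the second argument fails to match the inferred T:
const bad = choose("hello", 42); // error in modern TypeScript

// An explicit type argument expresses the union intent directly:
const ok = choose<string | number>("hello", 42);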

But in this comment, the discussion switches to what to do when there are no candidates. Should it be an error? Aaand, that's about as much as I can find in that issue. The issue was resolved without erroring on an empty set of candidates, and the suggestion was closed as "declined".

There's also microsoft/TypeScript#2511 which asks for this again. It's mentioned in this comment as option 2: "Give an error when {} is inferred because there are no inference candidates. We have discussed this option before, and it never really gained much traction, but we could revisit." And maybe some work was even done on it in a now-deleted branch called downWithDreadedCurlyCurly. But the issue ends up getting closed since a tangentially related issue fixes part of this (using contextual types to add a candidate from return types).
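
That return-type mechanism means a contextual type can rescue the original example; a sketch, reusing doStuff from the question:

// The contextual type of the declaration contributes an inference
// candidate for T via the return type, so T is string here:
const named: string = doStuff();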

And finally there's microsoft/TypeScript#5254, asking for this again, some discussion goes back and forth, and nothing comes of it.

So, that's it. It's intended; some people thought, a while back, that it should be different; the idea never got much traction; and it was abandoned. None of this really says why the current behavior is preferred. The answer may well be inertia: it works well enough for most people, and those who had a problem with it had their problem addressed in other ways.

If someone can find a more canonical answer to this, I'd be interested in seeing it. Good luck!

answered by jcalz