Disclaimer: As of now, I have not found documentation that explicitly supports the type inference mechanism discussed in this article. This is based on my understanding and experience with TypeScript. Readers are encouraged to explore further and refer to the official TypeScript documentation for additional information.
TL;DR
The NoInfer<T> utility type in TypeScript blocks type inference, ensuring types are matched explicitly. It works by using a type parameter T in a way that prevents automatic type inference.
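As a quick taste, here is a minimal sketch of mine (it assumes TypeScript 5.4 or later, where NoInfer ships as a built-in utility type; createPalette and its parameter names are made up for illustration):
// colors is the only inference site for C, so defaultColor must match what was inferred.
function createPalette<C extends string>(colors: C[], defaultColor?: NoInfer<C>): C[] {
  return colors;
}
createPalette(['red', 'green'], 'red');     // OK: C is inferred as 'red' | 'green'
// createPalette(['red', 'green'], 'blue'); // error: 'blue' is not assignable to 'red' | 'green'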
NoInfer - the New Utility Type
I recently came across an awesome video and article by Matt introducing the NoInfer utility type and its mechanism.
According to Matt's article and video, a self-implemented version of NoInfer is:
type NoInfer<T> = [T][T extends any ? 0 : never];
I wondered: why does this work?
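Before digging into inference, here is a small sanity check of my own: for any concrete type argument, this NoInfer simply resolves back to T.
type NoInfer<T> = [T][T extends any ? 0 : never];
// 'hello' extends any, so the index is 0, and ['hello'][0] is just 'hello' again.
type A = NoInfer<'hello'>;         // 'hello'
type B = NoInfer<string | number>; // string | number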
Dive in with Examples
I asked my friend Jiar, and he explained that it has to do with TypeScript's type inference mechanism. What NoInfer does is block type inference. He gave me this example:
type NoInfer<T> = [T][T extends any ? 0 : never];
function testNoInfer<T extends string>(args: T[], noInferArgs: NoInfer<T>): boolean {
  return args.includes(noInferArgs);
}
testNoInfer(
  ['lets', 'go'],
  /** @ts-expect-error */
  'somewhere'
);
testNoInfer(
  ['people', 'help'],
  'people',
);
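For contrast, here is what I believe happens when the second parameter uses plain T instead of NoInfer<T> (a sketch from my own experiments, not from the docs): the second argument becomes an inference site too, so T simply widens and the typo is not caught.
function testPlain<T extends string>(args: T[], plainArgs: T): boolean {
  return args.includes(plainArgs);
}
// No error: 'somewhere' is also used to infer T,
// so T becomes 'lets' | 'go' | 'somewhere'.
testPlain(['lets', 'go'], 'somewhere');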
Does the Order of the Parameters Matter?
To understand what "blocking" means, let's change the order of the parameters in the testNoInfer function:
function testNoInfer<T extends string>(noInferArgs: NoInfer<T>, args: T[]): boolean {
  return args.includes(noInferArgs);
}
testNoInfer(
  /** @ts-expect-error */
  'somewhere',
  ['lets', 'go'],
);
testNoInfer(
  'people',
  ['people', 'help'],
);
You see, the order doesn't matter: it still works.
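A related observation of mine (worth double-checking yourself): NoInfer only blocks inference, so explicitly providing the type argument side-steps it entirely.
// With an explicit type argument nothing is inferred,
// so NoInfer<T> has no effect and 'somewhere' is accepted as part of T.
testNoInfer<'lets' | 'go' | 'somewhere'>('somewhere', ['lets', 'go']);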
A Flash of Inspiration
What if we keep only the noInferArgs parameter and drop the other one?
function testNoInfer<T extends string>(noInferArgs: NoInfer<T>): boolean {
  return Boolean(noInferArgs);
}
testNoInfer(
  'somewhere'
);
testNoInfer(
  'people'
);
No error! I knew it! The type NoInfer<T> is just the type parameter T itself!
So why does defining the other parameter change everything?
Conclusion
In my opinion, the "block" occurs because TypeScript infers T from the parameters typed as plain T (here, args) and only references that already-inferred T through NoInfer<T>. Therefore, when another parameter is defined with plain T, TypeScript insists that the NoInfer<T> argument match whatever it inferred from args. When no such parameter exists, there is nothing to infer T from, so T seems to fall back to its constraint (string) and any string is accepted, which is why the last example shows no error.
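To make this interpretation concrete, here is a sketch I put together (the pick and pickOnly helpers are invented for illustration, and the behaviour reflects my own testing rather than any official documentation):
type NoInfer<T> = [T][T extends any ? 0 : never];
// A plain-T parameter exists: T is inferred from options only.
declare function pick<T extends string>(options: T[], chosen: NoInfer<T>): T;
const a = pick(['red', 'green'], 'red');     // OK: T is 'red' | 'green'
// const b = pick(['red', 'green'], 'blue'); // error: 'blue' must match the inferred T
// No plain-T parameter: there is nothing to infer T from,
// so T appears to fall back to its constraint, string.
declare function pickOnly<T extends string>(chosen: NoInfer<T>): T;
const c = pickOnly('blue');                  // compiles; c is typed as string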